Accelerated Over-Relaxation Heavy-Ball Methods with Provable Acceleration and Global Convergence (2406.09772v1)

Published 14 Jun 2024 in math.OC

Abstract: The heavy-ball momentum method has gained widespread popularity for accelerating gradient descent by incorporating a momentum term. Recent studies have conclusively shown that the heavy-ball method cannot achieve an accelerated convergence rate for general smooth strongly convex optimization problems. This work introduces the Accelerated Over-Relaxation Heavy-Ball (AOR-HB) method, the first heavy-ball method to demonstrate provable global and accelerated convergence for smooth strongly convex optimization. The key innovation of AOR-HB lies in applying an over-relaxation technique to the gradient term. This technique also allows the method to be applied to min-max problems and to meet optimal lower complexity bounds. The result addresses a long-standing theoretical gap in heavy-ball momentum methods and paves the way for accelerated methods that extend beyond convex optimization to non-convex optimization. Numerical experiments validate the effectiveness of the proposed algorithms, with performance matching that of other leading first-order optimization methods.
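The abstract says the key idea is an over-relaxation applied to the gradient term of the heavy-ball iteration. The sketch below is a minimal numerical illustration of one common form such an over-relaxation can take, namely gradient extrapolation of the form grad f(x_k) + gamma * (grad f(x_k) - grad f(x_{k-1})) combined with the usual heavy-ball momentum step. The function name, the exact update, and the parameter values alpha, beta, gamma are illustrative assumptions, not the AOR-HB scheme or the parameter choices from the paper.

```python
import numpy as np


def heavy_ball_over_relaxed(grad, x0, alpha, beta, gamma, iters=1000):
    """Heavy-ball iteration with an over-relaxed (extrapolated) gradient term.

    Hypothetical sketch, not the paper's AOR-HB method:
      g_or    = grad f(x_k) + gamma * (grad f(x_k) - grad f(x_{k-1}))
      x_{k+1} = x_k + beta * (x_k - x_{k-1}) - alpha * g_or
    alpha (step size), beta (momentum), and gamma (over-relaxation)
    are assumed tuning parameters.
    """
    x_prev = x0.copy()
    x = x0.copy()
    g_prev = grad(x0)
    for _ in range(iters):
        g = grad(x)
        g_or = g + gamma * (g - g_prev)                   # over-relaxed gradient
        x_next = x + beta * (x - x_prev) - alpha * g_or   # heavy-ball update
        x_prev, x, g_prev = x, x_next, g
    return x


if __name__ == "__main__":
    # Example: minimize the strongly convex quadratic f(x) = 0.5 * x^T A x.
    A = np.diag([1.0, 10.0, 100.0])          # condition number 100
    grad = lambda x: A @ x
    x_final = heavy_ball_over_relaxed(grad, np.ones(3),
                                      alpha=0.01, beta=0.8, gamma=1.0)
    print("distance to minimizer:", np.linalg.norm(x_final))
```

On the quadratic example the iterates converge to the minimizer at the origin; the parameter values above are hand-picked for this instance and carry none of the paper's acceleration or global-convergence guarantees.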

