Convergence of iterates for first-order optimization algorithms with inertia and Hessian driven damping (2107.05943v1)

Published 13 Jul 2021 in math.OC

Abstract: In a Hilbert space setting, for convex optimization, we show the convergence of the iterates to optimal solutions for a class of accelerated first-order algorithms. These algorithms can be interpreted as discrete temporal versions of an inertial dynamic involving both viscous damping and Hessian-driven damping. The asymptotically vanishing viscous damping is linked to Nesterov's accelerated gradient method, while the Hessian-driven damping makes it possible to significantly attenuate the oscillations. Treating the Hessian-driven damping as the time derivative of the gradient term yields, in discretized form, first-order algorithms. These results complement the authors' previous work, which established the fast convergence of the function values and the fast convergence of the gradients towards zero.
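For orientation, the inertial dynamic referred to in the abstract has the generic form below. This is a sketch rather than the paper's exact statement: α denotes the viscous damping parameter and β the Hessian-driven damping coefficient, and the paper's variant may carry additional time-dependent coefficients.

```latex
% Inertial dynamic with viscous and Hessian-driven damping (generic form):
% the alpha/t viscous damping vanishes asymptotically (Nesterov regime),
% and the nabla^2 f term is the Hessian-driven damping.
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t)
            + \beta\,\nabla^2 f(x(t))\,\dot{x}(t)
            + \nabla f(x(t)) = 0
```

Since ∇²f(x(t))ẋ(t) is the time derivative of t ↦ ∇f(x(t)), a discretization can replace the Hessian term with a difference of consecutive gradients, which keeps the resulting algorithm first-order. The following Python sketch illustrates that idea under stated assumptions; the step size, damping constants, and the k/(k+α) momentum weight are illustrative defaults, not the paper's exact scheme or parameter choices:

```python
import numpy as np

def inertial_hessian_damped(grad_f, x0, step=0.01, alpha=3.0, beta=0.1, iters=500):
    """Inertial gradient method with Hessian-driven damping (illustrative sketch).

    The continuous Hessian damping beta * d/dt grad f(x(t)) is discretized
    as a difference of consecutive gradients, so the scheme evaluates only
    gradients. All hyperparameters here are assumptions for this sketch.
    """
    x_prev = np.asarray(x0, dtype=float).copy()
    x = x_prev.copy()
    g_prev = grad_f(x)
    for k in range(1, iters + 1):
        g = grad_f(x)
        # Extrapolation with asymptotically vanishing viscous damping
        # (momentum weight k / (k + alpha), Nesterov-style), plus a
        # gradient-difference term playing the role of Hessian damping.
        y = x + (k / (k + alpha)) * (x - x_prev) - beta * (g - g_prev)
        x_prev = x
        x = y - step * grad_f(y)  # gradient step at the extrapolated point
        g_prev = g
    return x

# Usage on an ill-conditioned quadratic f(x) = 0.5 * x^T A x,
# whose unique minimizer is the origin.
A = np.diag([1.0, 100.0])
x_final = inertial_hessian_damped(lambda x: A @ x, np.array([1.0, 1.0]))
print(x_final)  # iterates should approach the origin
```

The gradient-difference term is what the abstract credits with significantly attenuating the oscillations that plain inertial methods tend to exhibit.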
