
Rates of superlinear convergence for classical quasi-Newton methods (2003.09174v3)

Published 20 Mar 2020 in math.OC

Abstract: We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was well established a long time ago that asymptotically these methods converge superlinearly, the corresponding rates of convergence still remain unknown. In this paper, we address this problem. We obtain the first explicit non-asymptotic rates of superlinear convergence for the standard quasi-Newton methods, which are based on the updating formulas from the convex Broyden class. In particular, for the well-known DFP and BFGS methods, we obtain the rates of the form $(\frac{n L^2}{\mu^2 k})^{k/2}$ and $(\frac{n L}{\mu k})^{k/2}$ respectively, where $k$ is the iteration counter, $n$ is the dimension of the problem, $\mu$ is the strong convexity parameter, and $L$ is the Lipschitz constant of the gradient.
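As a concrete illustration (not taken from the paper itself), the BFGS method the abstract refers to maintains an inverse-Hessian approximation $H_k$ and updates it from the step $s_k = x_{k+1} - x_k$ and gradient difference $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$. A minimal sketch on a strongly convex quadratic, with unit step sizes and illustrative values of $\mu$ and $L$ chosen here for the example:

```python
import numpy as np

def bfgs_update(H, s, y):
    """Standard BFGS update of the inverse-Hessian approximation H.

    Guarantees the secant equation H_new @ y == s and preserves
    positive definiteness whenever y @ s > 0.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Example problem: f(x) = 0.5 x^T A x, a strongly convex quadratic.
# Eigenvalues of A give mu = 0.5 and L = 1.5 (illustrative choice).
A = np.diag([0.5, 1.0, 1.5])
grad = lambda x: A @ x

x = np.array([1.0, 1.0, 1.0])
H = np.eye(3)                  # initial inverse-Hessian guess
for k in range(20):
    g = grad(x)
    s = -H @ g                 # quasi-Newton step with unit step size
    x_new = x + s
    y = grad(x_new) - g
    H = bfgs_update(H, s, y)
    x = x_new

print(np.linalg.norm(grad(x))) # gradient norm shrinks rapidly
```

The superlinear behavior the paper quantifies shows up here as a contraction factor that improves with the iteration counter $k$; for BFGS the paper's bound is of the form $(nL/(\mu k))^{k/2}$.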

Citations (54)
