On finite termination of quasi-Newton methods on quadratic problems (2407.03072v3)

Published 3 Jul 2024 in math.OC

Abstract: Quasi-Newton methods form an important class of methods for solving nonlinear optimization problems. In such methods, first-order information is used to approximate the second derivative. The aim is to mimic the fast convergence that Newton-based methods can guarantee. In the best case, quasi-Newton methods far outperform steepest descent and other first-order methods, without the computational cost of calculating the exact second derivative. These convergence guarantees hold locally, which follows from the fact that a strongly convex objective function is well approximated by a quadratic function close to the solution. Understanding the performance of quasi-Newton methods on quadratic problems with a symmetric positive definite Hessian is therefore of vital importance. In the classic case, an approximation of the Hessian is updated at every iteration and exact line search is used. It is well known that the algorithm then terminates finitely, even when the Hessian approximation is memoryless, i.e., it requires only the most recent information. In this paper, we explore how the reliance on exact line search and the dependence on conjugate search directions can be relaxed while preserving the finite termination properties of quasi-Newton methods on quadratic problems. We show that it suffices to create a memoryless quasi-Newton matrix based on two vectors to make it possible to compute a Newton direction within a finite number of iterations, independently of the step lengths. It is not necessary for the quasi-Newton approximation to act as the Hessian on the full space.
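
As a concrete illustration of the classic setting the abstract refers to, the sketch below runs memoryless BFGS with exact line search on a quadratic f(x) = 0.5 x^T A x - b^T x with symmetric positive definite A. This is a minimal sketch, not the paper's algorithm: the function name memoryless_bfgs_quadratic and all parameter choices are illustrative. With exact line search on a quadratic, the memoryless BFGS directions coincide with conjugate gradient directions, so the iteration terminates in at most n steps in exact arithmetic.

```python
# Minimal sketch (illustrative, not the paper's algorithm): memoryless BFGS
# with exact line search on the quadratic f(x) = 0.5 x^T A x - b^T x.
import numpy as np

def memoryless_bfgs_quadratic(A, b, x0, tol=1e-10, max_iter=None):
    """Minimize 0.5 x^T A x - b^T x for symmetric positive definite A."""
    n = len(b)
    if max_iter is None:
        max_iter = n + 1               # finite termination expected within n steps
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                      # gradient of the quadratic
    s = y = None
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            return x, k                # terminated finitely
        if s is None:
            p = -g                     # first step: H = I, steepest descent
        else:
            # Memoryless BFGS: one BFGS update of H = I using only the most
            # recent pair (s, y), i.e.
            #   H = (I - rho s y^T)(I - rho y s^T) + rho s s^T,
            # applied directly to g without forming H.
            rho = 1.0 / (y @ s)
            v = g - rho * (s @ g) * y
            p = -(v - rho * (y @ v) * s + rho * (s @ g) * s)
        # Exact line search along p has a closed form on a quadratic.
        alpha = -(g @ p) / (p @ (A @ p))
        s = alpha * p                  # s_k = x_{k+1} - x_k
        x = x + s
        g_new = A @ x - b
        y = g_new - g                  # y_k = g_{k+1} - g_k (= A s_k here)
        g = g_new
    return x, max_iter
```

A small usage example on a hypothetical well-conditioned test problem:

```python
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)            # symmetric positive definite
x_star, iters = memoryless_bfgs_quadratic(A, rng.standard_normal(5), np.zeros(5))
print(iters)                           # at most n = 5 in exact arithmetic
```

Floating-point rounding may occasionally add an iteration beyond n, but the behaviour matches the finite termination property the abstract describes for the exact-line-search case.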
