On finite termination of quasi-Newton methods on quadratic problems

Published 3 Jul 2024 in math.OC (arXiv:2407.03072v3)

Abstract: Quasi-Newton methods form an important class of methods for solving nonlinear optimization problems. In such methods, first-order information is used to approximate the second derivative. The aim is to mimic the fast convergence that can be guaranteed by Newton-based methods. In the best case, quasi-Newton methods will far outperform steepest descent and other first-order methods, without the computational cost of calculating the exact second derivative. These convergence guarantees hold locally, which follows closely from the fact that, if the objective function is strongly convex, it can be approximated well by a quadratic function close to the solution. Understanding the performance of quasi-Newton methods on quadratic problems with a symmetric positive definite Hessian is therefore of vital importance. In the classic case, an approximation of the Hessian is updated at every iteration and exact line search is used. It is well known that the algorithm terminates finitely, even when the Hessian approximation is memoryless, i.e., it requires only the most recent information. In this paper, we explore the ways in which reliance on exact line search and dependence on conjugate search directions can be relaxed while preserving the finite-termination properties of quasi-Newton methods on quadratic problems. We show that it suffices to create a memoryless quasi-Newton matrix based on two vectors to give the ability to compute a Newton direction within a finite number of iterations, independent of step lengths. It is unnecessary for the quasi-Newton approximation to act as the Hessian on the full space.
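
The classic setting the abstract refers to can be illustrated with a small numerical sketch. The snippet below shows that classic result, not the paper's relaxed scheme: a memoryless BFGS direction (the BFGS inverse update applied to the identity, built only from the most recent step and gradient change) combined with exact line search on a strongly convex quadratic, where finite termination in at most n iterations is well known. Function and variable names are illustrative choices, not taken from the paper.

```python
# Illustrative sketch (not the paper's algorithm): memoryless BFGS with exact
# line search on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
# In this classic setting the method generates conjugate directions and, in
# exact arithmetic, terminates in at most n iterations.
import numpy as np

def memoryless_bfgs_quadratic(A, b, x0, tol=1e-10):
    """Minimize 0.5 x^T A x - b^T x with A symmetric positive definite."""
    n = len(b)
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b                          # gradient of the quadratic
    s = y = None                           # most recent step / gradient change
    for k in range(n + 1):
        if np.linalg.norm(g) <= tol:       # finite termination check
            return x, k
        if s is None:
            p = -g                         # first iteration: steepest descent
        else:
            # Memoryless BFGS direction: apply the BFGS inverse update to the
            # identity using only the most recent pair (s, y), then apply to g.
            rho = 1.0 / (y @ s)
            v = g - rho * (s @ g) * y      # (I - rho * y s^T) g
            Hg = v - rho * (y @ v) * s + rho * (s @ g) * s
            p = -Hg
        alpha = -(g @ p) / (p @ (A @ p))   # exact line search step length
        s = alpha * p
        x = x + s
        g_new = A @ x - b
        y = g_new - g
        g = g_new
    return x, k

if __name__ == "__main__":
    # Random 6x6 SPD quadratic: termination after at most 6 iterations.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((6, 6))
    A = M @ M.T + 6.0 * np.eye(6)
    b = rng.standard_normal(6)
    x_star, iters = memoryless_bfgs_quadratic(A, b, np.zeros(6))
    print(f"iterations: {iters}, residual: {np.linalg.norm(A @ x_star - b):.2e}")
```

With exact line search, this memoryless update generates conjugate search directions, which is why termination within n iterations is expected; the paper asks how much of this finite-termination behavior survives when exact line search and conjugacy of the search directions are relaxed.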
