Novel Limited Memory Quasi-Newton Methods Based On Optimal Matrix Approximation (2403.02860v1)

Published 5 Mar 2024 in math.OC

Abstract: Update formulas for the Hessian approximations in quasi-Newton methods such as BFGS can be derived as analytical solutions to certain nearest-matrix problems. In this article, we propose a similar idea for deriving new limited-memory versions of quasi-Newton methods. Most limited-memory quasi-Newton methods make use of Hessian approximations that can be written as a scaled identity matrix plus a symmetric matrix of limited rank. We derive a way of finding the nearest matrix of this type to an arbitrary symmetric matrix, in either the Frobenius norm, the induced $\ell_2$ norm, or a dissimilarity measure for positive definite matrices in terms of trace and determinant. In doing so, we lay down a framework for more general matrix optimization problems with unitarily invariant matrix norms and arbitrary constraints on the set of eigenvalues. We then propose a trust-region method in which the Hessian approximation, after having been updated by a Broyden-class formula and used to solve a trust-region subproblem, is replaced by one of its closest limited-memory approximations. We propose to store the Hessian approximation in terms of its eigenvectors and eigenvalues in a way that completely defines its eigenvalue decomposition, as this simplifies both the solution of the trust-region subproblem and the nearest limited-memory matrix problem. Our method is compared to a reference trust-region method with the usual limited-memory BFGS updates, and is shown to require fewer iterations and the storage of fewer vectors for a variety of test problems.
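To make the central nearest-matrix idea concrete, here is a minimal NumPy sketch of the Frobenius-norm case: approximating a symmetric matrix by a scaled identity plus a symmetric matrix of rank at most m. This is our own illustration, not the authors' code; the function name and the window-scan strategy are assumptions. It relies on two standard facts: for a unitarily invariant norm the optimum can be chosen to share the eigenvectors of the input, and among sorted eigenvalues the fixed-size subset minimizing the sum of squared deviations from its own mean is contiguous.

```python
import numpy as np

def nearest_scaled_identity_plus_rank_m(B, m):
    """Frobenius-nearest approximation of symmetric B by
    sigma*I + U @ np.diag(d) @ U.T, where U has m orthonormal columns.
    Illustrative sketch; not the paper's implementation."""
    n = B.shape[0]
    lam, Q = np.linalg.eigh(B)   # eigenvalues of B in ascending order
    k = n - m                    # eigenvalues absorbed into the sigma*I term
    if k <= 0:                   # m >= n: B itself is representable exactly
        return 0.0, Q, lam
    # The m kept eigenpairs are matched exactly (zero error); the remaining
    # k eigenvalues are all replaced by their mean sigma.  Among sorted
    # eigenvalues, the size-k subset with the smallest squared deviation
    # from its mean is a contiguous window, so a linear scan suffices.
    best_err, best_i, best_sigma = np.inf, 0, 0.0
    for i in range(m + 1):       # candidate windows lam[i : i + k]
        w = lam[i:i + k]
        mu = w.mean()
        err = np.sum((w - mu) ** 2)
        if err < best_err:
            best_err, best_i, best_sigma = err, i, mu
    keep = np.r_[0:best_i, best_i + k:n]   # indices of the m retained eigenpairs
    return best_sigma, Q[:, keep], lam[keep] - best_sigma

# Usage: compress a random symmetric matrix to identity-plus-rank-2 form.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
B = (A + A.T) / 2
sigma, U, d = nearest_scaled_identity_plus_rank_m(B, m=2)
H = sigma * np.eye(6) + U @ np.diag(d) @ U.T
```

Note that the returned triple (sigma, U, d) fully determines the eigenvalue decomposition of the compressed matrix H, which illustrates why the paper's eigenvector-based storage scheme simplifies both the trust-region subproblem and the subsequent nearest limited-memory matrix problem.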

Citations (1)
