
Modified limited memory BFGS with displacement aggregation and its application to the largest eigenvalue problem

Published 13 Jan 2023 in math.OC (arXiv:2301.05447v3)

Abstract: We present a modified limited memory BFGS method with displacement aggregation (AggMBFGS) for solving nonconvex optimization problems. AggMBFGS refines the curvature pair updates by discarding linearly dependent variable displacements, ensuring that the inverse Hessian approximation retains the essential curvature information. As a result, its per-iteration complexity and storage requirement are $\mathcal{O}(\tau d)$, where $\tau \leq d$ is the memory size and $d$ is the problem dimension. We establish the global convergence of both the underlying modified L-BFGS method (M-LBFGS) and AggMBFGS under a backtracking modified Armijo line search (MALS), and prove the local superlinear convergence of AggMBFGS, demonstrating its theoretical advantages over M-LBFGS with the classical Armijo line search~\cite{Shi2016ALM}. Numerical experiments on CUTEst test problems~\cite{gould2015cutest} confirm that AggMBFGS outperforms M-LBFGS in reducing the number of iterations and function evaluations. Additionally, we apply AggMBFGS to compute the largest eigenvalue of high-dimensional real symmetric positive definite matrices, achieving lower relative errors than M-LBFGS~\cite{Shi2016ALM} while maintaining computational efficiency. These results suggest that AggMBFGS is a promising alternative for large-scale nonconvex optimization and eigenvalue computation.
