Modified limited memory BFGS with displacement aggregation and its application to the largest eigenvalue problem (2301.05447v3)
Abstract: We present a modified limited memory BFGS method with displacement aggregation (AggMBFGS) for solving nonconvex optimization problems. AggMBFGS refines curvature pair updates by removing linearly dependent variable variations, ensuring that the inverse Hessian approximation retains essential curvature properties. As a result, its per-iteration computational and storage complexity are $\mathcal{O}(\tau d)$, where $\tau \leq d$ denotes the memory size and $d$ is the problem dimension. We establish the global convergence of both the baseline modified L-BFGS method (M-LBFGS) and AggMBFGS under a backtracking modified Armijo line search (MALS), and prove the local superlinear convergence of AggMBFGS, demonstrating its theoretical advantages over M-LBFGS with the classical Armijo line search~\cite{Shi2016ALM}. Numerical experiments on CUTEst test problems~\cite{gould2015cutest} confirm that AggMBFGS outperforms M-LBFGS in reducing the number of iterations and function evaluations. Additionally, we apply AggMBFGS to compute the largest eigenvalue of high-dimensional real symmetric positive definite matrices, achieving lower relative errors than M-LBFGS~\cite{Shi2016ALM} while maintaining computational efficiency. These results suggest that AggMBFGS is a promising alternative for large-scale nonconvex optimization and eigenvalue computation.
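To make the abstract's ingredients concrete, here is a minimal, self-contained Python sketch, not the paper's algorithm: a standard limited-memory BFGS two-loop recursion, a crude filter that skips storing a curvature pair whose displacement is numerically linearly dependent on the already-stored ones, and a driver that estimates the largest eigenvalue of a symmetric positive definite matrix by maximizing the Rayleigh quotient. The names `two_loop`, `nearly_dependent`, and `lbfgs_eig`, the skip-if-dependent rule, and the classical backtracking Armijo line search are all illustrative assumptions; AggMBFGS's displacement aggregation modifies the stored gradient variations rather than merely discarding pairs, and the paper uses a modified Armijo line search (MALS).

```python
import numpy as np

def two_loop(grad, S, Y):
    """Standard L-BFGS two-loop recursion: returns H*grad, where H is the
    inverse-Hessian approximation implied by the curvature pairs (s_i, y_i)."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(S, Y)]
    alphas = []
    for s, y, rho in zip(reversed(S), reversed(Y), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    # Scale the initial matrix H_0 = gamma*I using the most recent pair.
    gamma = (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1]) if S else 1.0
    r = gamma * q
    for (s, y, rho), a in zip(zip(S, Y, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r = r + (a - b) * s
    return r

def nearly_dependent(s_new, S, tol=1e-8):
    """Crude test: is s_new numerically in the span of the stored
    displacements? Uses the least-squares projection residual."""
    if not S:
        return False
    M = np.column_stack(S)
    coef, *_ = np.linalg.lstsq(M, s_new, rcond=None)
    resid = s_new - M @ coef
    return np.linalg.norm(resid) < tol * np.linalg.norm(s_new)

def neg_rayleigh(A, x):
    """Negative Rayleigh quotient and its gradient; minimizing it
    maximizes x'Ax / x'x, whose maximum is the largest eigenvalue."""
    xx = x @ x
    Ax = A @ x
    r = (x @ Ax) / xx
    return -r, -2.0 * (Ax - r * x) / xx

def lbfgs_eig(A, x0, mem=10, iters=300, gtol=1e-9):
    x = x0.copy()
    f, g = neg_rayleigh(A, x)
    S, Y = [], []
    for _ in range(iters):
        d = -two_loop(g, S, Y)
        # Backtracking Armijo line search (classical; a stand-in for MALS).
        t, c = 1.0, 1e-4
        while True:
            f_new, g_new = neg_rayleigh(A, x + t * d)
            if f_new <= f + c * t * (g @ d) or t < 1e-16:
                break
            t *= 0.5
        s_vec, y_vec = t * d, g_new - g
        x, f, g = x + t * d, f_new, g_new
        if np.linalg.norm(g) < gtol:
            break
        # Store the pair only if curvature is positive and the new
        # displacement is not linearly dependent on the stored ones.
        if s_vec @ y_vec > 1e-12 and not nearly_dependent(s_vec, S):
            S.append(s_vec); Y.append(y_vec)
            if len(S) > mem:
                S.pop(0); Y.pop(0)
    return -f, x / np.linalg.norm(x)

rng = np.random.default_rng(0)
d = 200
B = rng.standard_normal((d, d))
A = B @ B.T + d * np.eye(d)            # symmetric positive definite test matrix
lam, v = lbfgs_eig(A, rng.standard_normal(d))
print(lam, np.linalg.eigvalsh(A)[-1])  # estimate vs. dense reference
```

Dropping dependent displacements keeps the stored pairs well conditioned, which, per the abstract, is the property the paper's aggregation scheme preserves while also retaining the essential curvature information of the affected pairs.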