A Dynamic Subspace Based BFGS Method for Large Scale Optimization Problem
Abstract: Large-scale unconstrained optimization is a fundamental yet still not well-solved class of problems in numerical optimization. The main challenge in designing an algorithm is to keep storage requirements small and computations inexpensive while preserving global convergence. In this work, we propose a novel approach to large-scale unconstrained optimization that combines a dynamic subspace technique with the BFGS update. We demonstrate that, within the dynamic subspace, our approach attains the same rate of convergence as BFGS while requiring less memory than L-BFGS. We further give a convergence analysis by constructing a mapping from a low-dimensional Euclidean space to the adaptive subspace. We compare our hybrid algorithm with BFGS and L-BFGS; experimental results show that it offers significant advantages in parallel computability, convergence efficiency, and robustness.
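As context for the update rule the abstract builds on, the following is a minimal sketch of the standard BFGS inverse-Hessian update; the paper's dynamic-subspace restriction is not reproduced here, and the quadratic test problem (`A`, `b`) is purely illustrative.

```python
import numpy as np

def bfgs_update(H, s, y):
    # Standard BFGS inverse-Hessian update:
    #   H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T,  rho = 1 / (y^T s)
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Illustrative demo: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
H = np.eye(2)                         # initial inverse-Hessian approximation
for _ in range(20):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    p = -H @ g                        # quasi-Newton search direction
    alpha = -(g @ p) / (p @ A @ p)    # exact line search for a quadratic
    s = alpha * p                     # step:      s_k = x_{k+1} - x_k
    x_new = x + s
    y = grad(x_new) - g               # gradient change: y_k = g_{k+1} - g_k
    H = bfgs_update(H, s, y)
    x = x_new

print(np.allclose(A @ x, b))  # True: x solves A x = b
```

With an exact line search on a quadratic, the curvature condition `y @ s > 0` holds automatically, so the update stays well defined; general objectives need a Wolfe-type line search to guarantee it.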