Inexact Projected Preconditioned Gradient Methods with Variable Metrics: General Convergence Theory via Lyapunov Approach (2506.03671v1)
Abstract: Projected gradient methods are widely used for constrained optimization. A key application is to partial differential equations (PDEs), where the objective functional represents physical energy and the linear constraints enforce conservation laws. However, computing the projections onto the constraint set generally requires solving large-scale ill-conditioned linear systems. A common strategy is to relax the projection accuracy and apply preconditioners, which leads to the inexact preconditioned projected gradient descent (IPPGD) methods studied here. The theoretical analysis and dynamical behavior of IPPGD methods, along with an effective construction of the inexact projection operator itself, remain largely unexplored. We propose a strategy for constructing the inexact projection operator and develop a gradient-type flow to model the IPPGD methods. Discretization of this flow not only recovers the original IPPGD method but also yields a potentially faster novel method. Furthermore, we apply Lyapunov analysis with a carefully designed Lyapunov function to prove exponential convergence at the continuous level and linear convergence at the discrete level. We then apply the proposed method to solve nonlinear PDEs and present numerical results.
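To make the setup concrete, the following is a minimal sketch of an inexact projected gradient iteration for a linearly constrained problem, minimize f(x) subject to Ax = b. It is not the paper's method: the function names, the choice of preconditioned conjugate gradients as the inexact solver, and all parameters are illustrative assumptions. The exact projection onto the affine set is x - Aᵀ(AAᵀ)⁻¹(Ax - b); relaxing the inner solve of (AAᵀ)y = Ax - b (few iterations, loose tolerance, a preconditioner M⁻¹) gives an inexact projection in the spirit of IPPGD.

```python
import numpy as np

def inexact_project(x, A, b, M_inv, tol=1e-2, max_iter=5):
    """Approximate projection of x onto {z : A z = b}.

    Solves (A A^T) y = A x - b inexactly by preconditioned CG
    (an illustrative choice, not the paper's construction),
    then returns x - A^T y.  M_inv approximates (A A^T)^{-1}.
    """
    r = A @ x - b
    rn0 = np.linalg.norm(r)
    if rn0 == 0.0:          # already feasible
        return x
    y = np.zeros_like(r)
    res = r.copy()          # CG residual for (A A^T) y = r, starting from y = 0
    z = M_inv(res)
    p = z.copy()
    rz = res @ z
    for _ in range(max_iter):
        Gp = A @ (A.T @ p)  # apply A A^T without forming it
        alpha = rz / (p @ Gp)
        y += alpha * p
        res -= alpha * Gp
        if np.linalg.norm(res) <= tol * rn0:  # loose tolerance -> inexactness
            break
        z = M_inv(res)
        rz_new = res @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x - A.T @ y

def ippgd(grad_f, x0, A, b, step=0.1, iters=200, M_inv=lambda r: r):
    """Inexact projected gradient descent sketch: gradient step, then
    inexact projection back toward the constraint set."""
    x = inexact_project(x0, A, b, M_inv, tol=1e-10, max_iter=50)  # feasible start
    for _ in range(iters):
        x = inexact_project(x - step * grad_f(x), A, b, M_inv)
    return x

# Toy usage: minimize (1/2)||x||^2 subject to sum(x) = 1; minimizer is x_i = 1/4.
rng = np.random.default_rng(0)
A = np.ones((1, 4))
b = np.array([1.0])
x_star = ippgd(lambda x: x, rng.standard_normal(4), A, b)
```

Here `M_inv` is the identity (no preconditioning); for ill-conditioned AAᵀ arising from PDE discretizations, replacing it with an effective preconditioner is precisely what motivates the IPPGD framework.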