Shadow-point Enhanced Inexact Accelerated Proximal Gradient Method with Preserved Convergence Guarantees (2504.20453v1)

Published 29 Apr 2025 in math.OC

Abstract: We consider the problem of optimizing the sum of a smooth convex function and a non-smooth convex function using the inexact accelerated proximal gradient (APG) method. A key limitation of existing approaches is their reliance on feasible approximate solutions to subproblems, which are often computationally expensive or even unrealistic to obtain in practice. To address this limitation, we develop a shadow-point enhanced inexact accelerated proximal gradient method (SpinAPG), which eliminates the feasibility requirement while preserving all desirable convergence properties of the APG method, including iterate convergence and an $o(1/k^2)$ convergence rate for the objective function value, under suitable summable-error conditions. Our method also provides a more flexible and computationally efficient inexact framework for the APG method, with a fairly easy-to-implement error criterion. Finally, we demonstrate the practical advantages of our SpinAPG through numerical experiments on a relaxation of the quadratic assignment problem, showcasing its effectiveness while bypassing the explicit computation of a feasible point.
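For context, the problem class the abstract describes is composite minimization, $\min_x f(x) + g(x)$ with $f$ smooth and $g$ non-smooth but prox-friendly. Below is a minimal sketch of the standard (exact) accelerated proximal gradient iteration — the baseline that SpinAPG relaxes — applied to a small LASSO instance where the proximal operator is soft-thresholding. This is illustrative only: it is not the paper's SpinAPG (which handles inexact, possibly infeasible subproblem solutions), and the LASSO instance and all names here are our own choices, not from the paper.

```python
import numpy as np

def apg(grad_f, prox_g, x0, step, iters=500):
    """Standard (exact) accelerated proximal gradient (FISTA-style) iteration
    for min f(x) + g(x). Each prox subproblem is solved exactly here; the
    paper's SpinAPG instead tolerates inexact, infeasible approximate
    solutions under summable-error conditions."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        # Proximal (forward-backward) step at the extrapolated point y
        x_new = prox_g(y - step * grad_f(y), step)
        # Nesterov momentum parameter update
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Extrapolation (the acceleration step)
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Illustrative LASSO instance: f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
b = rng.standard_normal(40)
lam = 0.1

grad_f = lambda x: A.T @ (A @ x - b)
# Prox of lam*||.||_1 with step s is componentwise soft-thresholding
prox_g = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad_f (spectral norm squared)

x_star = apg(grad_f, prox_g, np.zeros(20), 1.0 / L)
```

In this exact setting the prox is available in closed form; the paper's contribution concerns the regime where the subproblem must itself be solved numerically, and SpinAPG removes the need for those approximate solutions to be feasible.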