
Relaxed Weak Accelerated Proximal Gradient Method: a Unified Framework for Nesterov's Accelerations

Published 9 Apr 2025 in math.OC | (2504.06568v1)

Abstract: This paper studies accelerated proximal gradient methods in which the sequence controlling the momentum term does not follow Nesterov's rule. We propose the relaxed weak accelerated proximal gradient (R-WAPG) method, a generic algorithm that unifies the convergence results for strongly convex and convex problems, where the extrapolation constant is characterized by a sequence much weaker than Nesterov's rule. R-WAPG provides a unified framework for several notable Euclidean variants of FISTA and establishes their convergence. In addition, we derive the convergence rate for strongly convex objectives with a constant momentum term. Without resorting to restarting, we also reformulate R-WAPG as "Free R-WAPG", which requires no parameters. Exploratory numerical experiments were conducted to demonstrate its competitive advantages.
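For context, the classical scheme that the abstract's momentum condition relaxes is the accelerated proximal gradient method (FISTA), in which the extrapolation constant is driven by Nesterov's rule t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2. The sketch below applies it to a lasso-type problem; it is an illustrative baseline under assumed problem data, not the paper's R-WAPG algorithm, and the function names are hypothetical.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, n_iter=200):
    """Classical FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    The momentum sequence t_k follows Nesterov's rule; the paper's
    R-WAPG framework allows sequences satisfying a weaker condition.
    """
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                       # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)  # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        beta = (t - 1.0) / t_new                       # extrapolation (momentum) constant
        y = x_new + beta * (x_new - x)                 # momentum step
        x, t = x_new, t_new
    return x
```

Replacing the update of `t` (and hence `beta`) with a sequence that merely satisfies a weaker growth condition is, per the abstract, the kind of variation that R-WAPG covers in a unified way.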


Authors (2)
