Practical Acceleration of the Condat-Vũ Algorithm (2403.17100v1)

Published 25 Mar 2024 in math.OC

Abstract: The Condat-Vũ algorithm is a widely used primal-dual method for optimizing composite objectives of three functions. Several algorithms for optimizing composite objectives of two functions are special cases of Condat-Vũ, including proximal gradient descent (PGD). It is well known that PGD exhibits suboptimal performance, that a simple adjustment to PGD can accelerate its convergence rate from $\mathcal{O}(1/T)$ to $\mathcal{O}(1/T^2)$ on convex objectives, and that this accelerated rate is optimal. In this work, we show that a simple adjustment to the Condat-Vũ algorithm allows it to recover accelerated PGD (APGD) as a special case, instead of PGD. We prove that this accelerated Condat-Vũ algorithm achieves optimal convergence rates and significantly outperforms the traditional Condat-Vũ algorithm in regimes where the Condat-Vũ algorithm approximates the dynamics of PGD. We demonstrate the effectiveness of our approach in various applications in machine learning and computational imaging.
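As a point of reference, the sketch below shows the standard (unaccelerated) Condat-Vũ iteration for problems of the form min_x f(x) + g(x) + h(Ax), with f smooth and g, h proximable, and a small lasso-style usage example in which h = 0 is replaced by a simple l1 term with A = identity. The function names, step-size choices, and example problem are illustrative assumptions, not taken from the paper, and the accelerated variant proposed in this work is not reproduced here.

```python
import numpy as np

def condat_vu(grad_f, prox_g, prox_h_conj, A, At, x0, y0, tau, sigma, iters=500):
    """Textbook Condat-Vu iteration for min_x f(x) + g(x) + h(Ax).

    grad_f:      gradient of the smooth term f
    prox_g:      (v, tau) -> prox of tau*g at v
    prox_h_conj: (y, sigma) -> prox of sigma*h* (convex conjugate of h) at y
    A, At:       the linear operator and its adjoint
    Note: when h = 0 the dual variable stays at zero and the x-update
    collapses to proximal gradient descent, the special case the paper
    starts from before introducing momentum.
    """
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        # Primal step: forward step on f and the dual term, backward step on g.
        x_new = prox_g(x - tau * (grad_f(x) + At(y)), tau)
        # Dual step: ascent on the conjugate of h with over-relaxed primal point.
        y = prox_h_conj(y + sigma * A(2.0 * x_new - x), sigma)
        x = x_new
    return x

# Illustrative example (assumed, not from the paper):
#   lasso  min_x 0.5*||Bx - b||^2 + lam*||x||_1,
#   cast as f(x) = 0.5*||Bx - b||^2, g = 0, h = lam*||.||_1, A = I.
rng = np.random.default_rng(0)
B = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
lam = 0.1
L = np.linalg.norm(B, 2) ** 2                       # Lipschitz constant of grad f
grad_f = lambda x: B.T @ (B @ x - b)
prox_g = lambda v, t: v                             # g = 0, so prox is the identity
prox_h_conj = lambda y, s: np.clip(y, -lam, lam)    # h* is the indicator of the l_inf ball
A = At = lambda v: v                                # A = identity
sigma = 1.0
tau = 1.0 / (L / 2 + sigma + 1e-6)                  # satisfies 1/tau - sigma*||A||^2 >= L/2
x_hat = condat_vu(grad_f, prox_g, prox_h_conj, A, At,
                  np.zeros(100), np.zeros(100), tau, sigma)
```

The step sizes follow the usual Condat-Vũ condition 1/tau - sigma*||A||^2 >= L/2; the paper's contribution is a modified iteration that, in the h = 0 regime, recovers APGD rather than plain PGD.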
