The global linear convergence rate of the proximal version of the generalized alternating direction method of multipliers for separable convex programming

Published 19 Feb 2022 in math.OC | (2202.09610v5)

Abstract: To solve separable convex optimization problems with linear constraints, Eckstein and Bertsekas introduced the generalized alternating direction method of multipliers (in short, GADMM), an efficient and simple acceleration scheme for the alternating direction method of multipliers. Recently, Fang et al. proposed the linearized version of the generalized alternating direction method of multipliers (in short, L-GADMM), in which one of the subproblems is approximated by a linearization strategy, and proved its worst-case $\mathcal{O}(1/t)$ convergence rate, measured by iteration complexity, in both the ergodic and nonergodic senses. In this paper, we introduce the doubly linearized version of the generalized alternating direction method of multipliers (in short, DL-GADMM), in which both the $x$-subproblem and the $y$-subproblem are approximated by linearization strategies. Based on the error bound approach, we establish the linear convergence rate of both L-GADMM and DL-GADMM for separable convex optimization problems in which the subdifferentials of the underlying functions are piecewise linear multifunctions. The results in this paper extend, generalize and improve some known results in the literature.
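To make the doubly linearized scheme concrete, the sketch below shows a simplified linearized-ADMM-style iteration with a relaxation factor in the GADMM spirit, applied to a small LASSO instance ($\min_x \|x\|_1 + \tfrac{1}{2}\|y-d\|^2$ s.t. $Ax - y = 0$). This is an illustrative reading of the general idea, not the paper's exact DL-GADMM update: the function `dl_gadmm_lasso`, the step parameters `tau`/`sigma`/`beta`/`alpha`, and the stopping rule are all assumptions chosen for this toy problem.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def dl_gadmm_lasso(A, d, beta=1.0, alpha=1.0, iters=500):
    """Illustrative doubly linearized ADMM with GADMM-style relaxation
    for  min ||x||_1 + 0.5*||y - d||^2  s.t.  A x - y = 0.
    Both subproblems are replaced by proximal (linearized) steps."""
    m, n = A.shape
    x, y, lam = np.zeros(n), np.zeros(m), np.zeros(m)
    # Prox parameters: tau > beta*||A||^2 and sigma > beta*||B||^2 (here B = -I)
    tau = beta * np.linalg.norm(A, 2) ** 2 * 1.01
    sigma = beta * 1.01
    for _ in range(iters):
        # x-step: linearize the augmented quadratic around x^k, then prox of ||.||_1
        grad_x = beta * A.T @ (A @ x - y + lam / beta)
        x = soft_threshold(x - grad_x / tau, 1.0 / tau)
        # GADMM-style relaxation: mix the new A x with the old y (alpha = 1: no relaxation)
        Ax_rel = alpha * (A @ x) + (1.0 - alpha) * y
        # y-step: linearize the augmented quadratic around y^k, then prox of 0.5*||. - d||^2
        grad_y = -beta * (Ax_rel - y + lam / beta)
        z = y - grad_y / sigma
        y = (sigma * z + d) / (sigma + 1.0)
        # dual update
        lam = lam + beta * (Ax_rel - y)
    return x, y
```

With $A = I$ the problem reduces to $\min_x \|x\|_1 + \tfrac{1}{2}\|x - d\|^2$, whose solution is the soft-thresholding of $d$ at level 1, so the iteration can be sanity-checked against a closed form.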
