
The global linear convergence rate of the proximal version of the generalized alternating direction method of multipliers for separable convex programming (2202.09610v5)

Published 19 Feb 2022 in math.OC

Abstract: To solve the separable convex optimization problem with linear constraints, Eckstein and Bertsekas introduced the generalized alternating direction method of multipliers (in short, GADMM), an efficient and simple acceleration scheme for the alternating direction method of multipliers. Recently, Fang et al. proposed the linearized version of the generalized alternating direction method of multipliers (in short, L-GADMM), in which one of the subproblems is approximated by a linearization strategy, and proved its worst-case $\mathcal{O}(1/t)$ convergence rate, measured by the iteration complexity, in both the ergodic and nonergodic senses. In this paper, we introduce the doubly linearized version of the generalized alternating direction method of multipliers (in short, DL-GADMM), in which both the $x$-subproblem and the $y$-subproblem are approximated by linearization strategies. Based on the error bound approach, we establish the linear convergence rate of both L-GADMM and DL-GADMM for separable convex optimization problems in which the subdifferentials of the underlying functions are piecewise linear multifunctions. The results in this paper extend, generalize, and improve some known results in the literature.
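
For reference, here is a minimal sketch of the setting in the usual Eckstein–Bertsekas notation; the symbols $\alpha$, $\beta$, $G$, $H$, and $\tau$ below are standard choices assumed for illustration rather than taken from this paper. The separable problem is $\min_{x,y}\ f(x)+g(y)$ subject to $Ax+By=b$, and one GADMM pass with relaxation factor $\alpha\in(0,2)$ and penalty $\beta>0$ reads

% Sketch of one GADMM iteration (Eckstein-Bertsekas form; notation assumed,
% not taken verbatim from the paper).
\begin{align*}
x^{k+1} &= \arg\min_x\ f(x) - \langle\lambda^k, Ax\rangle + \tfrac{\beta}{2}\,\|Ax + By^k - b\|^2,\\
y^{k+1} &= \arg\min_y\ g(y) - \langle\lambda^k, By\rangle + \tfrac{\beta}{2}\,\bigl\|\alpha Ax^{k+1} - (1-\alpha)(By^k - b) + By - b\bigr\|^2,\\
\lambda^{k+1} &= \lambda^k - \beta\,\bigl(\alpha Ax^{k+1} - (1-\alpha)(By^k - b) + By^{k+1} - b\bigr).
\end{align*}

The linearized variants add a proximal term $\tfrac{1}{2}\|x - x^k\|_G^2$ to the $x$-subproblem (and, in the doubly linearized scheme, $\tfrac{1}{2}\|y - y^k\|_H^2$ to the $y$-subproblem). A common choice such as $G = \tau I - \beta A^{\top}A$ with $\tau > \beta\|A^{\top}A\|$ cancels the quadratic coupling, so the subproblem reduces to a proximal step on $f$ alone.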
