On the Sublinear Convergence Rate of Multi-Block ADMM
Abstract: The alternating direction method of multipliers (ADMM) is widely used for solving structured convex optimization problems. Despite its success in practice, the convergence properties of the standard ADMM for minimizing the sum of $N$ $(N\geq 3)$ convex functions, with $N$ block variables linked by linear constraints, remained unclear for a long time. In this paper, we present convergence and convergence rate results for the standard ADMM applied to the $N$-block $(N\geq 3)$ convex minimization problem, under the condition that one of the functions is convex (not necessarily strongly convex) and the other $N-1$ functions are strongly convex. Specifically, in that case the ADMM is proven to converge with rate $O(1/t)$ in a certain ergodic sense and $o(1/t)$ in a non-ergodic sense, where $t$ denotes the number of iterations. As a by-product, we also provide a simple proof of the $O(1/t)$ convergence rate of the two-block ADMM in terms of both objective error and constraint violation, without assuming any condition on the penalty parameter or strong convexity of the functions.
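For concreteness, the standard multi-block ADMM referred to above performs cyclic (Gauss-Seidel) minimization of the augmented Lagrangian followed by a dual update. The sketch below uses generic notation, with penalty parameter $\beta>0$, multiplier $\lambda$, and iteration index $k$ (these symbols are assumed here and need not match the paper's):
\begin{align*}
\mathcal{L}_\beta(x_1,\dots,x_N;\lambda)
  &= \sum_{i=1}^{N} f_i(x_i)
     - \Bigl\langle \lambda,\ \sum_{i=1}^{N} A_i x_i - b \Bigr\rangle
     + \frac{\beta}{2}\Bigl\| \sum_{i=1}^{N} A_i x_i - b \Bigr\|^2, \\
x_i^{k+1} &\in \operatorname*{arg\,min}_{x_i}\;
  \mathcal{L}_\beta\bigl(x_1^{k+1},\dots,x_{i-1}^{k+1},\,x_i,\,x_{i+1}^{k},\dots,x_N^{k};\,\lambda^k\bigr),
  \qquad i = 1,\dots,N, \\
\lambda^{k+1} &= \lambda^k - \beta\Bigl(\sum_{i=1}^{N} A_i x_i^{k+1} - b\Bigr),
\end{align*}
where the problem is $\min \sum_{i=1}^{N} f_i(x_i)$ subject to $\sum_{i=1}^{N} A_i x_i = b$. The $o(1/t)$ and $O(1/t)$ rates in the abstract are stated for this iteration under the convexity assumptions described above.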