
A Unified Alternating Direction Method of Multipliers by Majorization Minimization (1607.02584v1)

Published 9 Jul 2016 in cs.NA

Abstract: Accompanying the rising popularity of compressed sensing, the Alternating Direction Method of Multipliers (ADMM) has become the most widely used solver for linearly constrained convex problems with separable objectives. In this work, we observe that many previous variants of ADMM update the primal variable by minimizing different majorant functions, with their convergence proofs given case by case. Inspired by the principle of majorization minimization, we present unified frameworks and convergence analyses for the Gauss-Seidel ADMMs and the Jacobian ADMMs, which use different historical information for the current update. Our frameworks further generalize previous ADMMs to ones capable of solving problems with non-separable objectives by minimizing their separable majorant surrogates. We also show that the bound which measures the convergence speed of ADMMs depends on the tightness of the majorant function used. Several techniques are then introduced to improve the efficiency of ADMMs by tightening the majorant functions. In particular, we propose the Mixed Gauss-Seidel and Jacobian ADMM (M-ADMM), which alleviates the slow convergence of Jacobian ADMMs by absorbing the merits of the Gauss-Seidel ADMMs. M-ADMM can be further improved by backtracking, wise variable partitioning, and fully exploiting the structure of the constraint. Beyond the theoretical guarantees, numerical experiments on both synthetic and real-world data demonstrate the superiority of the new ADMMs in practice. Finally, we release a toolbox at https://github.com/canyilu/LibADMM that implements efficient ADMMs for many problems in compressed sensing.

Citations (133)

Summary

  • The paper proposes a unified framework for the Alternating Direction Method of Multipliers (ADMM) based on Majorization Minimization, covering both Gauss-Seidel and Jacobian variants for convex optimization.
  • It provides a unified convergence proof showing an O(1/K) rate for both ADMM variants, linking convergence speed directly to the tightness of the majorant functions used.
  • The work introduces an improved Mixed Gauss-Seidel and Jacobian ADMM (M-ADMM) variant, techniques like backtracking to enhance efficiency, and releases a toolkit (LibADMM) for practical application.

Unified Alternating Direction Method of Multipliers by Majorization Minimization

The paper proposes a unified framework for the Alternating Direction Method of Multipliers (ADMM) for solving linearly constrained convex optimization problems with separable objectives. ADMM is widely used in compressed sensing and in related applications such as image processing and computer vision. This work brings a novel perspective by employing a Majorization Minimization (MM) approach, presenting frameworks for both the Gauss-Seidel and Jacobian variants of ADMM.
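For orientation, the basic two-block ADMM iteration alternates two primal minimizations with a dual ascent step. Below is a minimal sketch for the lasso problem in scaled form; this is the textbook special case rather than the paper's unified scheme, and the function name and defaults (`rho`, `iters`) are illustrative.

```python
# Minimal two-block Gauss-Seidel ADMM for the lasso problem
#   min_{x,z} 0.5*||A x - b||^2 + lam*||z||_1   s.t.  x - z = 0
# in scaled dual form. Textbook special case, not the paper's generalized
# scheme; rho and iters are illustrative defaults.
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)  # u: scaled dual variable
    AtA_rhoI = A.T @ A + rho * np.eye(n)             # cached for every x-update
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: minimize 0.5||Ax - b||^2 + (rho/2)||x - z + u||^2
        x = np.linalg.solve(AtA_rhoI, Atb + rho * (z - u))
        # z-update: soft-thresholding, the prox of (lam/rho)*||.||_1
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual ascent on the constraint x - z = 0
        u = u + x - z
    return x
```

Each iteration costs one linear solve and one soft-threshold; caching (or factorizing) `A.T @ A + rho*I` is the standard way to amortize the x-update.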

Key Contributions

  1. Unified Framework with Majorization Minimization: The authors introduce a majorant first-order surrogate concept to unify and generalize various ADMM methods. The primal variables are updated by minimizing majorant functions, accommodating both separable and non-separable objectives (a schematic block update is sketched after this list).
  2. Convergence Analysis: The paper provides a unified convergence proof for both Gauss-Seidel and Jacobian ADMMs, showing an O(1/K) convergence rate. The analysis ties the bound to the tightness of the majorant functions used, offering insight into how tightness affects convergence speed (a schematic form of the bound follows the code sketch after this list).
  3. Improved ADMM Variants: A new Mixed Gauss-Seidel and Jacobian ADMM (M-ADMM) is proposed, which enhances convergence by combining the strengths of the two orderings (contrasted in the sketch after this list). Techniques such as backtracking and wise variable partitioning are introduced to tighten the majorant functions further and improve efficiency.
  4. Practical Implementations and Tool Release: Through numerical experiments, the paper demonstrates the effectiveness of the new ADMM methods on both synthetic and real-world data. The authors also release a toolkit, LibADMM, to facilitate the application of these findings across different compressed sensing problems.
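To make items 1 and 3 concrete, the sketch below performs one majorized primal sweep for the multi-block problem min_x sum_i f_i(x_i) s.t. sum_i A_i x_i = b: the coupled quadratic penalty is replaced by a separable quadratic majorant, so each block update reduces to a proximal step, and a flag switches between the Gauss-Seidel (serial, fresh residual) and Jacobian (parallel, stale residual) orderings. The names `majorized_sweep`, `prox`, and `eta` are our placeholders, not LibADMM's API, and the per-block curvatures `eta[i]` must upper-bound the coupling (roughly rho*||A_i||^2, larger in the Jacobian case) for the surrogate to majorize.

```python
# One majorized primal sweep for  min sum_i f_i(x_i)  s.t.  sum_i A_i x_i = b,
# in scaled dual form. Placeholder names, not LibADMM's API.
import numpy as np

def majorized_sweep(blocks, A, b, u, rho, prox, eta, gauss_seidel=True):
    """blocks : list of current x_i arrays     A : list of A_i matrices
    u      : scaled dual variable              prox(i, v, t) ~ prox_{t*f_i}(v)
    eta    : per-block curvatures, eta[i] >= rho*||A_i||^2 (assumption: larger
             values are needed in the Jacobian case for a valid majorant)."""
    n_blocks = len(blocks)
    # constraint residual at last iteration's blocks
    stale = sum(A[j] @ blocks[j] for j in range(n_blocks)) - b
    for i in range(n_blocks):
        if gauss_seidel:
            # serial: recompute with the blocks already updated in this sweep
            residual = sum(A[j] @ blocks[j] for j in range(n_blocks)) - b
        else:
            # parallel (Jacobian): every block sees only last iteration's state
            residual = stale
        # gradient of the coupled quadratic penalty at the current block
        grad = rho * A[i].T @ (residual + u)
        # proximal step on the separable majorant with curvature eta[i]
        blocks[i] = prox(i, blocks[i] - grad / eta[i], 1.0 / eta[i])
    return blocks
```

A full iteration follows the sweep with the dual update u <- u + (sum_i A_i x_i - b). In the spirit of M-ADMM as described in the abstract, one can mix the orderings by partitioning the blocks into groups that are updated Jacobian-style internally while the groups themselves are visited Gauss-Seidel-style.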
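For item 2, O(1/K) guarantees of this kind are typically stated in ergodic form. A schematic version, in our notation rather than the paper's exact theorem, is:

```latex
% Schematic ergodic O(1/K) bound: \bar{x}^K averages the first K iterates,
% \gamma weighs constraint violation, and C grows with the looseness of the
% majorant surrogates (our notation, not the paper's exact statement).
f(\bar{x}^K) - f(x^\star)
  + \gamma \Big\| \sum_i A_i \bar{x}_i^K - b \Big\|
  \le \frac{C}{K}
```

The takeaway matching the paper's claim: tighter majorants shrink the constant C, hence faster convergence.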

Impact and Implications

The paper's unified framework offers a versatile solution suite that handles a broader class of optimization problems, including those with non-separable objectives, which were previously challenging for ADMM. This generalization not only simplifies the convergence analysis but also provides practical guidelines for implementing more efficient ADMM algorithms.

The introduction of M-ADMM, particularly with backtracking, enhances the flexibility and efficiency of solving real-world problems, as demonstrated on tasks such as image reconstruction and matrix completion. The released toolbox further broadens accessibility and application across domains.

For future developments, this framework could be extended to nonconvex problems, demanding new insights into surrogate function design and convergence analysis. Additionally, adapting these methods for large-scale distributed systems could further broaden their applicability.

In conclusion, this work fundamentally enriches the landscape of optimization techniques by connecting MM with ADMM, providing both theoretical advancements and practical tools for solving complex optimization problems efficiently.
