Zero of Finite Sum of Maximally Monotone Operators
- The article addresses the problem of finding $x$ with $0 \in A_1(x) + \dots + A_n(x)$ on a Hilbert space, a foundation for structured convex optimization and monotone inclusions.
- It centers on minimal-lifting resolvent splittings, which combine recursive resolvent evaluations with averaged fixed-point iterations to guarantee convergence.
- The approach has practical applications in decentralized optimization, multi-block ADMM, imaging, and networked control, demonstrating scalable algorithm design.
A zero of a finite sum of maximally monotone operators concerns solutions to operator inclusions of the form $0 \in A_1(x) + \dots + A_n(x)$ on a real Hilbert space $\mathcal{H}$, where $A_1, \dots, A_n \colon \mathcal{H} \rightrightarrows \mathcal{H}$ are maximally monotone. This class of problems subsumes a wide spectrum of structured convex optimization and monotone inclusion models central to modern analysis and large-scale computational algorithms. The intricate structure of the maximally monotone summands, the nontrivial difficulty of evaluating the resolvent of the sum, and the essential role of splitting algorithms in applications ranging from distributed optimization to imaging motivate a comprehensive study of this inclusion.
1. Problem Formulation and Foundational Concepts
Given a real Hilbert space $\mathcal{H}$ with inner product $\langle \cdot, \cdot \rangle$ and induced norm $\|\cdot\|$, and maximally monotone operators $A_1, \dots, A_n \colon \mathcal{H} \rightrightarrows \mathcal{H}$, the central problem is to find $x \in \mathcal{H}$ such that

$$0 \in A_1(x) + A_2(x) + \dots + A_n(x).$$
Rewriting, the zero set is $\operatorname{zer}(A_1 + \dots + A_n) := \{x \in \mathcal{H} : 0 \in A_1(x) + \dots + A_n(x)\}$, which is a closed convex set under standard monotonicity and maximality assumptions.
The resolvent of $A$ with stepsize $\gamma > 0$ is $J_{\gamma A} := (\mathrm{Id} + \gamma A)^{-1}$, a single-valued, firmly nonexpansive operator whenever $A$ is maximally monotone. If $\gamma = 1$, $J_A$ denotes the unscaled resolvent.
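As a concrete illustration (not taken from the cited papers), the resolvent of the subdifferential of the $\ell_1$-norm is componentwise soft-thresholding, and the resolvent of a normal cone is a metric projection, independent of the stepsize; a minimal NumPy sketch with illustrative names:

```python
import numpy as np

def resolvent_abs(z, gamma):
    """Resolvent J_{gamma A} for A = subdifferential of |.| (soft-thresholding).

    Computes argmin_x gamma*|x| + 0.5*(x - z)^2, applied componentwise.
    """
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def resolvent_box_normal_cone(z, lo, hi):
    """Resolvent of the normal cone of the box [lo, hi]: the projection onto it.

    Projections onto convex sets do not depend on the stepsize gamma.
    """
    return np.clip(z, lo, hi)

# Firm nonexpansiveness can be checked numerically:
# ||J(x) - J(y)||^2 <= <J(x) - J(y), x - y> for random x, y.
```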
The sum operator $A_1 + \dots + A_n$ is again maximally monotone (by Rockafellar's sum theorem, under standard constraint qualifications), yet crucially, explicit computation of the resolvent $J_{\gamma(A_1 + \dots + A_n)}$ is intractable save for limited cases, motivating so-called splitting schemes: decompositions utilizing only the resolvents (or proximal operators) of the individual $A_i$.
2. Minimal-Lifting Resolvent Splitting and Lower Bound Theory
Traditional operator-splitting methods, such as Douglas–Rachford, naturally address $n = 2$; for $n \geq 3$, using each resolvent $J_{A_i}$ only once per iteration without auxiliary "lifting" is impossible except in special cases. The minimal-lifting framework asserts:
- Any frugal resolvent splitting for the sum of $n$ maximally monotone operators (one which uses each resolvent $J_{A_i}$ exactly once per iteration) requires a Cartesian product space $\mathcal{H}^d$ with $d \geq n - 1$.
- This lower bound is unimprovable: for every $n \geq 2$ there exist explicit frugal splittings acting on $\mathcal{H}^{n-1}$ (and no fewer copies of $\mathcal{H}$), with each iterate constructed recursively via a sequence of resolvent evaluations and auxiliary variables.
A canonical recursion (Malitsky et al., 2021) employs resolvent evaluations $x_1, \dots, x_n \in \mathcal{H}$ and an auxiliary variable $z = (z_1, \dots, z_{n-1}) \in \mathcal{H}^{n-1}$:

$$x_1 = J_{A_1}(z_1), \qquad x_i = J_{A_i}(z_i + x_{i-1} - z_{i-1}) \ \ (i = 2, \dots, n-1), \qquad x_n = J_{A_n}(x_1 + x_{n-1} - z_{n-1}),$$

then update

$$z_i^{+} = z_i + \gamma\,(x_{i+1} - x_i), \qquad i = 1, \dots, n-1, \qquad \gamma \in (0,1),$$

and output $x_n$ (in the limit, all $x_i$ coincide). This iteration is $\gamma$-averaged and converges weakly to a fixed point encoding a solution to the original inclusion. When $n = 2$, this specialization reduces to the classical Douglas–Rachford splitting on $\mathcal{H}$ (Malitsky et al., 2021).
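A minimal NumPy sketch of the recursion displayed above (function names, the demo operators, and the iteration budget are illustrative, not taken from the cited paper):

```python
import numpy as np

def minimal_lifting_splitting(resolvents, z0, gamma=0.5, iters=1000):
    """Frugal resolvent splitting with lifting n-1 (sketch of the recursion above).

    resolvents : list of n callables, resolvents[i](v) ~ J_{A_{i+1}}(v)
    z0         : list of n-1 vectors (the lifted variable z)
    gamma      : averaging parameter in (0, 1)
    """
    n = len(resolvents)
    z = [np.array(zi, dtype=float) for zi in z0]
    for _ in range(iters):
        x = [None] * n
        x[0] = resolvents[0](z[0])
        for i in range(1, n - 1):
            x[i] = resolvents[i](z[i] + x[i - 1] - z[i - 1])
        x[n - 1] = resolvents[n - 1](x[0] + x[n - 2] - z[n - 2])
        for i in range(n - 1):
            z[i] = z[i] + gamma * (x[i + 1] - x[i])
    return x  # in the limit, all x[i] coincide and solve 0 in sum A_i(x)

# Demo with A_i(x) = x - a_i, so J_{A_i}(v) = (v + a_i)/2 and zer(sum A_i) = mean(a_i).
a = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([5.0, 4.0])]
resolvents = [lambda v, ai=ai: 0.5 * (v + ai) for ai in a]
x = minimal_lifting_splitting(resolvents, [np.zeros(2), np.zeros(2)])
print(x[-1])  # approximately [2., 2.]
```

With $n = 2$ the same loop computes $x_2 = J_{A_2}(2x_1 - z_1)$ from $x_1 = J_{A_1}(z_1)$, i.e., a relaxed Douglas–Rachford update, consistent with the specialization noted above.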
3. Fixed-Point Theory and Convergence Analysis
The minimal-lifting resolvent splitting operator $T \colon \mathcal{H}^{n-1} \to \mathcal{H}^{n-1}$ defined above satisfies:
- $T$ is $\gamma$-averaged for $\gamma \in (0,1)$,
- $\operatorname{Fix} T \neq \emptyset$ if and only if $\operatorname{zer}(A_1 + \dots + A_n) \neq \emptyset$,
- Any $z \in \operatorname{Fix} T$ maps to a common value $x_1 = \dots = x_n = x$ with $x \in \operatorname{zer}(A_1 + \dots + A_n)$.
The sequence $(z^k)$ generated by $z^{k+1} = T(z^k)$ converges weakly to some $z^\star \in \operatorname{Fix} T$, and the associated evaluations $(x_1^k, \dots, x_n^k)$ all converge weakly to the same $x^\star \in \operatorname{zer}(A_1 + \dots + A_n)$. If the $A_i$ are uniformly monotone, strong convergence can be asserted in the limiting regime as in the Peaceman–Rachford variant.
The dimension bound (lifting degree $n-1$) is established by formalizing the dependency of each resolvent input on the current and preceding evaluations $x_j$ and lifted variables $z_j$, then arguing via a structural matrix form and a rank calculation that $n-1$ degrees of freedom are essential for unrestricted maximally monotone inputs (Malitsky et al., 2021).
4. Operator Splitting in the Finite Sum Regime
Several extensions and alternative schemes exist for addressing finite sums of maximally monotone operators:
Product-Space Reformulation: Cast $0 \in \sum_{i=1}^n A_i(x)$ as $0 \in (\mathbf{A} + N_D)(\mathbf{x})$ on $\mathcal{H}^n$, where $\mathbf{A} = A_1 \times \dots \times A_n$ and $D = \{(x, \dots, x) : x \in \mathcal{H}\}$ (the diagonal). The normal cone $N_D$ enforces consensus among the copies. Douglas–Rachford and related 2-operator splittings then apply in this higher-dimensional setting (Bot et al., 2012, Chen et al., 2022).
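A brief sketch of Douglas–Rachford applied to this product-space reformulation, assuming each resolvent $J_{\gamma A_i}$ is available as a callable of the form `resolvent(v, gamma)`; all names and parameter defaults are illustrative:

```python
import numpy as np

def product_space_douglas_rachford(resolvents, z0, gamma=1.0, lam=1.0, iters=500):
    """Douglas-Rachford on H^n for 0 in (A_1 x ... x A_n + N_D)(x).

    resolvents : list of n callables, resolvents[i](v, gamma) ~ J_{gamma A_{i+1}}(v)
    z0         : array of shape (n, d), the product-space governing sequence
    """
    z = np.array(z0, dtype=float)
    n = len(resolvents)
    for _ in range(iters):
        xbar = z.mean(axis=0)        # J_{gamma N_D} = projection onto the diagonal D
        x = np.tile(xbar, (n, 1))
        y = np.stack([resolvents[i](2 * x[i] - z[i], gamma) for i in range(n)])
        z = z + lam * (y - x)        # relaxation parameter lam in (0, 2)
    return z.mean(axis=0)            # consensus point approximates a zero of the sum
```

Note that this route lifts to $\mathcal{H}^n$ (one copy of $\mathcal{H}$ per operator), whereas the minimal-lifting scheme of Section 2 works on $\mathcal{H}^{n-1}$.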
Primal-Dual and Forward-Backward(-Forward) Splitting: Extensions to problems featuring both set-valued maximally monotone summands and single-valued Lipschitzian (or cocoercive) summands employ composite or hybrid forward-backward, reflected-backward, or forward-reflected-backward schemes, potentially with variable metrics. These approaches enable splitting algorithms to handle composite, sum-of-composite, or distributed settings with efficient per-iteration complexity and broad convergence guarantees (Bot et al., 2012, Vũ, 2012, Tam et al., 14 Dec 2025).
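As the simplest instance, for $0 \in A(x) + B(x)$ with $A$ maximally monotone and $B$ single-valued and $\beta$-cocoercive, the forward-backward update $x^{k+1} = J_{\gamma A}(x^k - \gamma B(x^k))$ converges for $\gamma \in (0, 2\beta)$. A minimal sketch (the helper names and the toy problem are illustrative, not from the cited works):

```python
import numpy as np

def forward_backward(resolvent_A, B, x0, gamma, iters=500):
    """Forward-backward splitting for 0 in A(x) + B(x), with B cocoercive.

    gamma must satisfy 0 < gamma < 2*beta, where B is beta-cocoercive
    (for B = grad f with L-Lipschitz gradient, beta = 1/L).
    """
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = resolvent_A(x - gamma * B(x), gamma)  # backward step after a forward step
    return x

# Demo: 0 in d|x|_1 + (x - b), whose solution is the prox of |.|_1 evaluated at b.
b = np.array([3.0, -0.2, 0.5])
soft = lambda v, g: np.sign(v) * np.maximum(np.abs(v) - g, 0.0)
print(forward_backward(soft, lambda x: x - b, np.zeros(3), gamma=1.0))  # [2., 0., 0.]
```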
Adaptive and Relaxed Variants: In the presence of monotonicity constants (e.g., strong/weak monotonicity, Lipschitz continuity), adaptive Douglas–Rachford designs modulate reflection and averaging parameters to restore nonexpansivity (or contraction) and guarantee global or even linear convergence, particularly in two-operator settings but with partial extensions to sums (Dao et al., 2018).
5. Application Domains and Distributed/Decentralized Optimization
Frugal, minimal-lifting splittings have direct applications to:
- Decentralized optimization over networked agents: Each operator $A_i$ may correspond to an agent's local objective or constraint. Minimal-lifting schemes assign local variables $z_i$ to agents, require only communication among neighbors (e.g., on cycle graphs), and evaluate each resolvent $J_{A_i}$ precisely once per iteration without a central coordinator. Convergence and low per-iteration complexity are provable, and network-step-size independence can be attained in specifically designed forward-backward-type algorithms (Malitsky et al., 2021, Tam et al., 14 Dec 2025).
- Multi-block ADMM: For multi-block linearly constrained convex programs, the dual inclusion $0 \in B_1(y) + \dots + B_n(y)$ with each $B_i$ maximally monotone admits resolution via the $n$-operator splitting above, yielding convergent multi-block extensions of ADMM (a prototypical dual inclusion is displayed after this list). In the case $n = 2$, this recovers the (convergent) two-block ADMM/Douglas–Rachford on the dual; for $n \geq 3$, it provides convergence guarantees not enjoyed by the standard direct multi-block ADMM (Malitsky et al., 2021).
- Composite monotone inclusion: Inclusions with linear compositions and block structure (e.g., $0 \in A(x) + L^{*}B(Lx)$) are amenable to minimal-lifting resolvent splittings where each operator and each application of the linear map and its adjoint is invoked only once per iteration. This is particularly vital when the linear operator $L$ has large norm or is expensive to apply (Briceño-Arias, 2021).
- Structured convex minimization, monotone games, networked Nash equilibria: Product-space and minimal-lifting splitting techniques underlie efficient solvers for (possibly non-differentiable) convex programs in imaging, regression, location theory, and distributed control (Bot et al., 2012, Tam et al., 14 Dec 2025).
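As an illustration of the multi-block ADMM connection above (a sketch via standard Fenchel duality; distributing the constant $b$ evenly across the $n$ terms is one of several equivalent choices): for the program $\min \sum_{i=1}^n f_i(x_i)$ subject to $\sum_{i=1}^n C_i x_i = b$, the dual optimality condition can be written as

$$0 \in \sum_{i=1}^{n} B_i(y), \qquad B_i(y) := \frac{b}{n} - C_i\, \partial f_i^{*}(-C_i^{*} y),$$

where each $B_i$ is maximally monotone under standard constraint qualifications, so the $n$-operator resolvent splitting applies directly to the dual variable $y$.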
6. Duality Frameworks, Extended Solution Concepts, and Generalization
Attouch–Théra duality reveals deep connections between solutions of the primal inclusion $0 \in A(x) + B(x)$ and the dual inclusion $0 \in A^{-1}(u) + B^{-\vee}(u)$, with the dual operator $B^{-\vee} := (-\mathrm{Id}) \circ B^{-1} \circ (-\mathrm{Id})$. Existence of primal and dual solutions, paramonotonicity, and the structure of the extended solution set (pairs drawn from the graphs of $A$ and $B$) enable recovery of all primal solutions from a single dual element and vice versa (Bauschke et al., 2011). These dual interpretations carry over to the finite-sum regime through appropriate product-space constructs.
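Concretely, in one common sign convention (stated here as a sketch consistent with the definitions above), the primal and dual inclusions and the extended solution set $\mathcal{Z}$ are linked by

$$x \ \text{solves} \ 0 \in A(x) + B(x) \iff \exists\, u \ \text{with} \ (x, u) \in \mathcal{Z}, \qquad \mathcal{Z} := \{(x, u) : u \in A(x), \ -u \in B(x)\},$$

$$u \ \text{solves} \ 0 \in A^{-1}(u) + B^{-\vee}(u) \iff \exists\, x \ \text{with} \ (x, u) \in \mathcal{Z},$$

so $\mathcal{Z}$ projects onto the primal solution set in its first coordinate and onto the dual solution set in its second.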
Normal problem and generalized zeros: For potentially inconsistent inclusions ($\operatorname{zer}(A + B) = \emptyset$), the normal problem framework introduces a systematic perturbation (using the infimal displacement vector of the associated Douglas–Rachford splitting operator) to define a "least-distance" generalized solution set that reduces to the original if solutions exist, but is always nonempty by construction (Bauschke et al., 2013). This unifies classical least squares, best approximation problems, and infeasibility regularization under a common normal equations paradigm.
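For reference (a standard definition, stated here for completeness): the infimal displacement vector of the nonexpansive Douglas–Rachford operator $T$ is the unique minimal-norm element

$$v := P_{\overline{\operatorname{ran}}(\mathrm{Id} - T)}(0),$$

which vanishes whenever the original inclusion admits a solution; the normal problem perturbs the inclusion by $v$ so that its generalized solution set is nonempty by construction.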
7. Methodological Table: Minimal-Lifting Frugal Splittings and Operator Splitting Strategies
| Splitting Method | Operator Class | Lifting Dimension | Main Iteration |
|---|---|---|---|
| Minimal-lifting resolvent split | $n$ maximally monotone | $n-1$ | Recursive resolvent sweep with averaged fixed-point update |
| Douglas–Rachford | $2$ maximally monotone | $1$ | Reflected-resolvent composition, averaged fixed-point update |
| Primal-dual product-space | $n$ maximally monotone | $n$ | Product-space splitting, consensus constraint |
| Decentralized FBS/FBF | Sum plus Lipschitzian | $N$ or $2N$ (agents) | Forward-backward (possibly extragradient) |
| Multi-block ADMM (dual) | Subdifferential operators | $n-1$ or as above | Dual splitting; primal-dual updates |
The breadth of monotone inclusion splitting techniques for the zeros of finite sums of maximally monotone operators reflects the deep interplay between operator-theoretic structural properties, fixed-point and duality frameworks, and the design of scalable, distributed, and provably convergent algorithms in applied computation (Malitsky et al., 2021, Tam et al., 14 Dec 2025, Bot et al., 2012, Briceño-Arias, 2021, Vũ, 2012, Dao et al., 2018, Bauschke et al., 2013, Bauschke et al., 2011, Chen et al., 2022).