Bregman Peaceman-Rachford Splitting Method
- Bregman Peaceman-Rachford Splitting Method is an operator splitting algorithm that uses Bregman distances to extend classical methods to non-Euclidean geometries.
- It employs reflection operators based on strictly convex Legendre functions to efficiently solve monotone inclusions and structured convex optimization problems.
- Its applications range from discrete optimal transport to structured convex programming, offering scalable and symmetric update schemes.
The Bregman Peaceman-Rachford Splitting Method (BPRS) is an operator splitting algorithm for monotone inclusions and structured convex optimization problems. By leveraging Bregman distances induced by strictly convex Legendre functions, BPRS generalizes classical Peaceman–Rachford splitting to non-Euclidean geometries. This makes BPRS particularly well-suited for problems where the underlying function structure or feasible set is naturally non-quadratic, as in probability-simplex or entropy-based models. BPRS appears as a principal variant in the literature and is closely related to Bregman Douglas–Rachford splitting (BDRS), Bregman ADMM, and the exponential multiplier method (Ma et al., 10 Sep 2025). The following sections detail its definition, mathematical structure, relations to other splitting methods, convergence criteria, key applications, and implications for algorithmic and theoretical advances.
1. Mathematical Definition and Algorithmic Structure
The Bregman Peaceman–Rachford Splitting Method extends Peaceman–Rachford operator splitting by replacing the standard quadratic norm penalty with a Bregman divergence generated by a Legendre function $\phi$, assumed to be proper, strictly convex, lower semicontinuous, and essentially smooth. The Bregman distance is
$$D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle.$$
Given a maximal monotone inclusion problem
$$\text{find } x \ \text{ such that } \ 0 \in A(x) + B(x),$$
where $A$ and $B$ are maximal monotone operators, the BPRS iteration uses the Bregman resolvent operator
$$J^\phi_{\gamma T} = (\nabla\phi + \gamma T)^{-1} \circ \nabla\phi$$
and its associated Bregman reflection operator
$$R^\phi_{\gamma T} = \nabla\phi^* \circ \big(2\,\nabla\phi \circ J^\phi_{\gamma T} - \nabla\phi\big),$$
for $T$ being either $A$ or $B$, and $\nabla\phi^*$ the gradient of the convex conjugate of $\phi$.
The BPRS update is then
$$z^{k+1} = R^\phi_{\gamma B}\big(R^\phi_{\gamma A}(z^k)\big),$$
where $\gamma > 0$ is a step size, typically required to satisfy problem-dependent upper bounds for convergence.
For $\phi = \tfrac{1}{2}\|\cdot\|^2$, these updates reduce to the classical Peaceman–Rachford method in the Euclidean (quadratic) setting.
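To make the operator definitions concrete, the following minimal sketch (not the paper's implementation) assembles a Bregman reflection from a user-supplied Bregman resolvent and the mirror maps $\nabla\phi$, $\nabla\phi^*$, then runs the BPRS fixed-point iteration. The toy problem and the quadratic choice of $\phi$ (so both mirror maps are the identity and each Bregman resolvent is an ordinary proximal map) are assumptions made here to keep every operator explicit; they recover classical Peaceman–Rachford.

```python
import numpy as np

def bregman_reflection(resolvent, grad_phi, grad_phi_star):
    """R^phi_{gamma T}(z) = grad_phi_star( 2*grad_phi(J^phi_{gamma T}(z)) - grad_phi(z) )."""
    return lambda z: grad_phi_star(2.0 * grad_phi(resolvent(z)) - grad_phi(z))

def bprs(z0, res_A, res_B, grad_phi, grad_phi_star, iters=200):
    """Bregman Peaceman-Rachford: z_{k+1} = R^phi_{gamma B}( R^phi_{gamma A}(z_k) )."""
    R_A = bregman_reflection(res_A, grad_phi, grad_phi_star)
    R_B = bregman_reflection(res_B, grad_phi, grad_phi_star)
    z = z0
    for _ in range(iters):
        z = R_B(R_A(z))
    return res_A(z)            # candidate solution x = J^phi_{gamma A}(z)

# Euclidean special case: phi = 0.5*||x||^2, so grad_phi = grad_phi_star = identity
# and each Bregman resolvent is the usual proximal map.
identity = lambda x: x
a, gamma = np.array([1.0, -2.0, 0.5]), 1.0
res_A = lambda z: (z + gamma * a) / (1.0 + gamma)   # prox of f(x) = 0.5*||x - a||^2
res_B = lambda z: np.maximum(z, 0.0)                # prox of the indicator of {x >= 0}

print(bprs(np.zeros(3), res_A, res_B, identity, identity))   # ~ [1.0, 0.0, 0.5] = max(a, 0)
```

The toy instance projects the point $a$ onto the nonnegative orthant; its solution $\max(a, 0)$ is known in closed form, which makes it easy to verify the fixed-point behaviour of the composed reflections.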
2. Relations to Bregman Douglas–Rachford Splitting and Bregman ADMM
BPRS is closely related to Bregman Douglas–Rachford Splitting (BDRS) and Bregman ADMM. The BDRS update involves sequential Bregman-resolvent steps and a Bregman Mann averaging operator:
$$z^{k+1} = \nabla\phi^*\Big(\tfrac{1}{2}\,\nabla\phi(z^k) + \tfrac{1}{2}\,\nabla\phi\big(R^\phi_{\gamma B} R^\phi_{\gamma A}(z^k)\big)\Big).$$
BPRS can be viewed as the "fully symmetric" variant, employing two reflections in succession rather than a weighted average, resulting in
$$z^{k+1} = R^\phi_{\gamma B}\big(R^\phi_{\gamma A}(z^k)\big).$$
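Written as code, the two update rules differ only in whether the double reflection is averaged with the current iterate. This is a minimal sketch under the conventions above; the $1/2$ averaging weight and the mirror-coordinate form of the Mann step are assumptions, not a quotation of the paper's scheme.

```python
def bdrs_step(z, R_A, R_B, grad_phi, grad_phi_star):
    # Bregman Mann average (weight 1/2) between z and the double reflection,
    # taken in mirror coordinates grad_phi(.).
    w = R_B(R_A(z))
    return grad_phi_star(0.5 * grad_phi(z) + 0.5 * grad_phi(w))

def bprs_step(z, R_A, R_B):
    # Fully symmetric variant: two reflections in succession, no averaging.
    return R_B(R_A(z))
```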
When specialized to dual problems arising from conic-structured primal constraints (such as $\mathcal{A}(x) = b$, $x \in \mathcal{K}$), the BPRS update coincides with a symmetric Bregman ADMM scheme involving two multiplier updates per iteration. This equivalence is crucial for interpreting the algorithm within the broader context of operator splitting, semidefinite programming relaxations, and optimal transport computations. Notably, in applications like optimal transport, the ADEMM (alternating direction exponential multiplier method) appears as a particular case.
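For orientation, recall the update that gives the exponential multiplier method its name in the classical inequality-constrained setting ($g_i(x) \le 0$, with $g_i$ denoting the constraint functions): under an entropy-type generator on the multipliers, the usual additive dual ascent step becomes multiplicative,
$$\lambda^{k+1}_i = \lambda^{k}_i\, \exp\!\big(\gamma\, g_i(x^{k+1})\big).$$
In the symmetric (Peaceman–Rachford-type) Bregman ADMM scheme mentioned above, a multiplier update of this kind is performed twice per iteration, once after each block update.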
3. Convergence Theory and Required Assumptions
Rigorous convergence analysis for BPRS depends on characteristics of $A$, $B$, and $\phi$. Two main settings are addressed:
- Relative Smoothness: If $f$ and $g$ (when $A = \partial f$ and $B = \partial g$) are smooth relative to $\phi$ (i.e., for some $L > 0$, both $L\phi - f$ and $L\phi - g$ are convex), then for step sizes $\gamma$ below a bound depending on $L$, globally convergent fixed-point iterations are ensured. A one-dimensional numerical check of this condition appears at the end of this section.
- Nonsmooth Setting: When $f$ and $g$ may be nonsmooth or only convex, additional conditions are required:
- The image of $\nabla\phi^*$ must lie in the intersection of the domains of $\partial f$ and $\partial g$,
- Uniform boundedness of subgradients,
- Strong convexity of $\phi$ on the relevant domain (e.g., the positive simplex for entropy),
- Lipschitz continuity of $\nabla\phi$ on that domain.
Under these conditions, sublinear rates of decrease of the optimality gap are obtained; specifically, for a suitably decaying step-size sequence $\gamma_k$, convergence to the optimal value follows via a telescoping inequality on Bregman distances.
In all cases, the image of the fixed-point set of $R^\phi_{\gamma B} R^\phi_{\gamma A}$ under the Bregman resolvent $J^\phi_{\gamma A}$ is a subset of the solution set $\operatorname{zer}(A + B)$.
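As a concrete illustration of the relative-smoothness condition in the first bullet above, the following spot-check uses a hypothetical one-dimensional pair (not an example taken from the paper): $f(x) = \tfrac{1}{2}x^2$ is $L$-smooth relative to the entropy $\phi(x) = x\log x - x$ on $(0, L]$, since there $(L\phi - f)'' = L/x - 1 \ge 0$.

```python
import numpy as np

# Relative smoothness: f is L-smooth relative to phi iff L*phi - f is convex,
# i.e. (L*phi - f)'' >= 0 on the domain of interest.
# Hypothetical 1-D example: f(x) = 0.5*x**2 and phi(x) = x*log(x) - x on (0, L].
L = 2.0
x = np.linspace(1e-3, L, 1000)

f_second   = np.ones_like(x)     # f''(x)   = 1
phi_second = 1.0 / x             # phi''(x) = 1/x

print(bool(np.all(L * phi_second - f_second >= 0)))   # True: L/x >= 1 whenever x <= L
```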
4. Algorithmic Properties and Practical Considerations
BPRS operates via Bregman reflections rather than proximal mappings, which enhances its flexibility in non-Euclidean domains (such as probability simplex or entropy-regularized spaces). For quadratic $\phi$, all reflection and resolvent operators are explicit; for entropy or generalized $\phi$, operators may require soft-thresholding or Sinkhorn-type scaling steps.
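As an illustration of the entropy case, the sketch below (an assumed, simplified setting rather than code from the paper) spells out the mirror maps $\nabla\phi = \log$ and $\nabla\phi^* = \exp$, the Bregman resolvent of a linear cost term (a multiplicative, "exponential" update), the Bregman resolvent of the normal cone of the unit-sum affine set (a Sinkhorn-type rescaling), and the resulting reflections.

```python
import numpy as np

# Entropy generator phi(x) = sum(x*log(x) - x) on the positive orthant.
grad_phi      = np.log     # mirror map,  grad phi
grad_phi_star = np.exp     # inverse map, grad phi* (gradient of the conjugate)

gamma, c = 0.5, np.array([0.3, 1.0, 0.1])

def res_linear(z):
    """Bregman resolvent of A = grad <c, .>: solve log(u) + gamma*c = log(z)."""
    return z * np.exp(-gamma * c)              # multiplicative ("exponential") update

def res_unit_sum(z):
    """Bregman (KL) projection onto the affine set {x : sum(x) = 1}."""
    return z / z.sum()                         # Sinkhorn-type rescaling

def reflection(resolvent):
    """R(z) = grad_phi_star( 2*grad_phi(J(z)) - grad_phi(z) )."""
    return lambda z: grad_phi_star(2.0 * grad_phi(resolvent(z)) - grad_phi(z))

z = np.array([0.4, 1.0, 0.6])
print(reflection(res_linear)(z))     # equals z * exp(-2*gamma*c), still element-wise
print(reflection(res_unit_sum)(z))   # equals z / sum(z)**2
```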
Parameter selection, particularly of the step sizes $\gamma$, is highly problem-dependent. For convex composite objectives, careful tuning of $\gamma$ is required to ensure contractivity and avoid divergence. Structural properties of the Bregman generator $\phi$, especially its strong convexity and domain compatibility, directly impact the admissible range of parameters.
A plausible implication is that applications with prior information about the metric structure or distributional constraints (e.g., categorical data, transport polytope) can exploit BPRS more naturally than classical quadratic-based methods.
5. Applications: Discrete Optimal Transport and Structured Convex Optimization
BPRS and related Bregman splitting algorithms have been applied to:
- Discrete Optimal Transport (OT): By formulating the OT problem as an inclusion between normal-cone operators and affine constraints, BPRS (and BDRS) yield iterative element-wise update schemes that avoid a full projection onto the simplex at each iteration (a sketch of these element-wise building blocks appears after this list). Compared to Sinkhorn–Knopp, BPRS offers computational advantages in entropy-regularized or more general Bregman-regularized settings. However, the paper notes that in some cases the required assumptions, such as the image of $\nabla\phi^*$ lying in the operator domains, may not strictly hold, so the convergence proofs may not directly apply.
- Structured Convex Optimization: For problems with conic or affine constraints and composite objectives, BPRS delivers a symmetric splitting scheme, inheriting desirable properties such as guaranteed convergence under appropriate parameter regimes.
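For the OT case, the element-wise character of the updates referenced in the first bullet can be seen from the building blocks below (a hypothetical sketch with arbitrarily chosen dimensions and data): under the entropy geometry, the resolvent of the linear cost is an entrywise exponential scaling, and the Bregman (KL) projections onto the two marginal constraints are row and column rescalings, so no full projection onto the transport polytope is ever formed.

```python
import numpy as np

# Element-wise building blocks for Bregman splitting on discrete OT:
#   min <C, X>  s.t.  X @ 1 = p,  X.T @ 1 = q,  X >= 0   (entropy geometry on X).
rng = np.random.default_rng(0)
C = rng.random((3, 4))                       # cost matrix (arbitrary data)
p = np.array([0.2, 0.3, 0.5])                # row marginal
q = np.array([0.1, 0.2, 0.3, 0.4])           # column marginal
gamma = 1.0
Z = rng.random((3, 4)) + 0.1                 # strictly positive current iterate

cost_step = Z * np.exp(-gamma * C)                    # Bregman resolvent of the linear cost
row_proj  = Z * (p / Z.sum(axis=1))[:, None]          # KL projection onto {X @ 1 = p}
col_proj  = Z * (q / Z.sum(axis=0))[None, :]          # KL projection onto {X.T @ 1 = q}

print(np.allclose(row_proj.sum(axis=1), p))           # True
print(np.allclose(col_proj.sum(axis=0), q))           # True
```

As noted above, assembling these blocks into a full BPRS iteration for OT may fall outside the stated convergence assumptions, so the sketch illustrates the update structure rather than a guaranteed-convergent solver.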
The algorithmic structure often leads to scalable implementations for networked or distributed environments, semidefinite relaxations, and imaging tasks, given that Bregman geometries can encode both sparsity and smoothness in the underlying solution space.
6. Comparison with Classical Peaceman–Rachford and Douglas–Rachford Methods
When $\phi = \tfrac{1}{2}\|\cdot\|^2$, BPRS coincides with classical PRS. In contrast to Douglas–Rachford, Peaceman–Rachford utilizes two unaveraged reflections per step, which can theoretically deliver faster local progress but requires stricter regularity for unconditional convergence. In Bregman settings, both DRS and PRS admit variants as BDRS and BPRS respectively, with the choice of geometry rendering certain problems more tractable or naturally aligned with the data.
The symmetric step structure of BPRS, and its affinity to Bregman ADMM, creates opportunities for constructing algorithms that are highly suited to dual and primal–dual formulations, particularly when dual variables admit Bregman-geometric representation.
7. Summary and Outlook
Bregman Peaceman–Rachford Splitting generalizes classical operator splitting schemes to settings where the quadratic norm is insufficient to capture problem geometry. By composing Bregman reflection operators, BPRS provides a powerful symmetry-based iterative splitting mechanism for monotone inclusions and structured optimization tasks, underpinned by explicit convergence theory for both relatively smooth and nonsmooth cases (Ma et al., 10 Sep 2025). Notably, its equivalence with symmetric Bregman ADMM and its applicability to OT and structured convex programs highlight its versatility.
Key practical guidelines include:
- Careful selection of Bregman generator matched to data and constraints,
- Adherence to global parameter bounds for step sizes,
- Verification of regularity and domain requirements when extending to new applications (especially in non-quadratic and entropy-regularized problems).
Ongoing research addresses extensions to non-separable multi-block structures, adaptive parameter tuning, and inexact updating schemes, with particular emphasis on convergence in applied domains where non-Euclidean geometries dominate.