
Bregman Peaceman-Rachford Splitting Method

Updated 13 September 2025
  • Bregman Peaceman-Rachford Splitting Method is an operator splitting algorithm that uses Bregman distances to extend classical methods to non-Euclidean geometries.
  • It employs reflection operators based on strictly convex Legendre functions to efficiently solve monotone inclusions and structured convex optimization problems.
  • Its applications span discrete optimal transport to structured convex programming, offering scalable and symmetric update schemes.

The Bregman Peaceman–Rachford Splitting Method (BPRS) is an operator splitting algorithm for monotone inclusions and structured convex optimization problems. By leveraging Bregman distances induced by strictly convex Legendre functions, BPRS generalizes classical Peaceman–Rachford splitting to non-Euclidean geometries. This makes BPRS particularly well-suited to problems where the underlying function structure or feasible set is naturally non-quadratic, as in probability-simplex or entropy-based models. BPRS appears as a principal variant in the literature and is closely related to Bregman Douglas–Rachford splitting (BDRS), Bregman ADMM, and the exponential multiplier method (Ma et al., 10 Sep 2025). The following sections detail its definition, mathematical structure, relations to other splitting methods, convergence criteria, key applications, and implications for algorithmic and theoretical advances.

1. Mathematical Definition and Algorithmic Structure

The Bregman Peaceman–Rachford Splitting Method extends Peaceman–Rachford operator splitting by replacing the standard quadratic norm penalty with a Bregman divergence generated by a Legendre function $h:\mathbb{R}^d \rightarrow \mathbb{R}$, assumed to be proper, strictly convex, lower semicontinuous, and essentially smooth. The Bregman distance is

$$D_h(x,y) = h(x) - h(y) - \langle \nabla h(y), x - y \rangle.$$
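As a concrete illustration (not from the paper), the definition above can be evaluated numerically for two standard generators: the quadratic $h$, whose Bregman distance is half the squared Euclidean distance, and the negative entropy, whose Bregman distance between simplex points is the Kullback–Leibler divergence.

```python
import numpy as np

def bregman_distance(h, grad_h, x, y):
    """D_h(x, y) = h(x) - h(y) - <grad h(y), x - y>."""
    return h(x) - h(y) - np.dot(grad_h(y), x - y)

# Quadratic generator: h(x) = 0.5*||x||^2, so D_h is half squared Euclidean distance.
quad = lambda x: 0.5 * np.dot(x, x)
grad_quad = lambda x: x

# Negative entropy on the positive orthant: h(x) = sum_i x_i log x_i,
# whose Bregman distance between simplex points is KL(x || y).
ent = lambda x: np.sum(x * np.log(x))
grad_ent = lambda x: np.log(x) + 1.0

x = np.array([0.2, 0.3, 0.5])
y = np.array([0.4, 0.4, 0.2])

d_quad = bregman_distance(quad, grad_quad, x, y)  # equals 0.5*||x - y||^2
d_ent = bregman_distance(ent, grad_ent, x, y)     # equals KL(x || y) here
print(d_quad, d_ent)
```

Both values are nonnegative, as strict convexity of $h$ guarantees, and vanish exactly when $x = y$.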

Given a maximal monotone inclusion problem,

$$0 \in A(x) + B(x),$$

where $A$ and $B$ are maximal monotone operators, the BPRS iteration uses the Bregman resolvent operator

$$J_T^h(z) = (\nabla h + T)^{-1}(\nabla h(z))$$

and its associated Bregman reflection operator

$$R_T^h(z) = \nabla h^*\left(2\,\nabla h(J_T^h(z)) - \nabla h(z)\right)$$

for $T$ being either $A$ or $B$, where $\nabla h^*$ denotes the gradient of the convex conjugate of $h$.

The BPRS update is then

$$z^{k+1} = R_{\gamma_k A}^h\left(R_{\gamma_k B}^h(z^k)\right)$$

where $\gamma_k$ is a step size, typically required to satisfy problem-dependent upper bounds for convergence.

For $h(x) = \frac{1}{2}\|x\|^2$, these updates reduce to the classical Peaceman–Rachford method in the Euclidean (quadratic) setting.
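A minimal numeric sketch of this quadratic-$h$ reduction, using two strongly convex one-dimensional quadratics whose proximal maps are explicit (the problem data $a$, $b$, and the step size below are illustrative choices, not from the paper):

```python
# Classical Peaceman-Rachford for min f(x) + g(x), with
#   f(x) = 0.5*(x - a)^2 and g(x) = 0.5*(x - b)^2 (both strongly convex),
# using the reflection R = 2*prox - I and z_{k+1} = R_f(R_g(z_k)).
a, b, gamma = 0.0, 4.0, 0.5

def prox_quad(z, c, gamma):
    # prox of gamma*0.5*(x - c)^2: argmin_x gamma*0.5*(x - c)^2 + 0.5*(x - z)^2
    return (z + gamma * c) / (1.0 + gamma)

def reflect(z, c, gamma):
    return 2.0 * prox_quad(z, c, gamma) - z

z = 10.0
for _ in range(60):
    z = reflect(reflect(z, b, gamma), a, gamma)

x = prox_quad(z, b, gamma)  # primal iterate; the minimizer is (a + b)/2 = 2
print(x)
```

Here strong convexity of both terms makes the composed-reflection map a contraction, so the iterates converge geometrically to a fixed point whose resolvent image is the minimizer $(a+b)/2$.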

2. Relations to Bregman Douglas–Rachford Splitting and Bregman ADMM

BPRS is closely related to Bregman Douglas–Rachford Splitting (BDRS) and Bregman ADMM. The BDRS update involves sequential Bregman-resolvent steps and a Bregman Mann averaging operator:

$$\begin{aligned} x^k &= J_{\gamma_k B}^h(z^k), \\ y^k &= J_{\gamma_k A}^h\left(\nabla h^*\left(2\nabla h(x^k) - \nabla h(z^k)\right)\right), \\ z^{k+1} &= \nabla h^*\left(\nabla h(z^k) - \nabla h(x^k) + \nabla h(y^k)\right). \end{aligned}$$

BPRS can be viewed as the "fully symmetric" variant, employing two reflections in succession rather than a weighted average, resulting in

$$z^{k+1} = R_{\gamma_k A}^h R_{\gamma_k B}^h (z^k).$$

When specialized to dual problems arising from conic-structured primal constraints (such as $Mu + Nv = b$), the BPRS update coincides with a symmetric Bregman ADMM scheme involving two multiplier updates per iteration. This equivalence is crucial for interpreting the algorithm within the broader context of operator splitting, semidefinite programming relaxations, and optimal transport computations. Notably, in applications like optimal transport, the ADEMM (alternating direction exponential multiplier method) appears as a particular case.
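The structural difference between the two schemes — an averaged (Mann) update for DRS versus two full reflections for PRS — can be seen side by side in the Euclidean special case. The toy problem below is a hypothetical illustration chosen so that both proximal maps are closed-form:

```python
# Euclidean case (h = 0.5*||.||^2): Douglas-Rachford averages the reflected
# step, while Peaceman-Rachford composes both reflections unaveraged.
# Toy problem: min 0.5*(x - a)^2 + 0.5*(x - b)^2, minimizer (a + b)/2.
a, b, gamma = -1.0, 3.0, 1.0
prox = lambda z, c: (z + gamma * c) / (1.0 + gamma)

def drs_step(z):
    x = prox(z, b)             # x^k = J_{gamma B}(z^k)
    y = prox(2.0 * x - z, a)   # y^k = J_{gamma A}(2 x^k - z^k)
    return z - x + y           # averaged (Mann) update

def prs_step(z):
    rb = 2.0 * prox(z, b) - z  # first reflection
    return 2.0 * prox(rb, a) - rb  # second reflection, no averaging

z_drs = z_prs = 7.0
for _ in range(100):
    z_drs, z_prs = drs_step(z_drs), prs_step(z_prs)

print(prox(z_drs, b), prox(z_prs, b))  # both approach (a + b)/2 = 1
```

On this strongly convex toy problem both variants recover the same minimizer; the unaveraged PRS map contracts faster here, consistent with the symmetric structure discussed above.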

3. Convergence Theory and Required Assumptions

Rigorous convergence analysis for BPRS depends on characteristics of $A$, $B$, and $h$. Two main settings are addressed:

  • Relative Smoothness: If $f$ and $g$ (when $A = \partial f$, $B = \partial g$) satisfy relative smoothness with respect to $D_h$—i.e., for some $L > 0$, $f(x) \leq f(y) + \langle \nabla f(y), x-y \rangle + L D_h(x,y)$—then for $\gamma_k \leq 1/L$, globally convergent fixed-point iterations are ensured.
  • Nonsmooth Setting: When $f$ and $g$ may be nonsmooth or only convex, additional conditions are required:
    • The image of $\nabla h^*$ must lie in the intersection of the domains of $f$ and $g$,
    • Uniform boundedness of subgradients,
    • Strong convexity of $h$ on the relevant domain (e.g., the positive simplex for entropy),
    • Lipschitz continuity of $\nabla h^*$.

Under these, sublinear rates of decrease in $D_h(z^k, z^*)$ are obtained; specifically, for decaying step size $\gamma_k = 1/\sqrt{k}$, convergence to $(A+B)^{-1}(0)$ follows via the telescoping inequality on Bregman distances.

In all cases, the fixed-point set $\operatorname{Fix}(R_A^h R_B^h)$ is a subset of the solution set $(A+B)^{-1}(0)$.

4. Algorithmic Properties and Practical Considerations

BPRS operates via Bregman reflections rather than proximal mappings, which enhances its flexibility in non-Euclidean domains (such as the probability simplex or entropy-regularized spaces). For quadratic $h$, all reflection and resolvent operators are explicit; for entropy or generalized $h$, operators may require soft-thresholding or Sinkhorn-type scaling steps.
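As a hedged example of an explicit non-quadratic resolvent: for the negative-entropy generator and a purely linear term $f(x) = \langle c, x \rangle$ (so the operator is the constant $c$), solving $\nabla h(x) + \gamma c = \nabla h(z)$ gives a closed-form multiplicative update of the kind alluded to above. The vectors $z$ and $c$ below are illustrative, not from the paper:

```python
import numpy as np

# Bregman resolvent under the negative-entropy generator h(x) = sum_i x_i log x_i
# for a linear term f(x) = <c, x> (so A = grad f = c is constant):
# solving grad h(x) + gamma*c = grad h(z), with grad h(u) = log u + 1,
# gives the multiplicative update
#   J^h_{gamma A}(z) = z * exp(-gamma * c),
# the same exponentiated scaling that appears in Sinkhorn-type schemes.
def entropic_resolvent(z, c, gamma):
    return z * np.exp(-gamma * c)

z = np.array([0.5, 0.3, 0.2])
c = np.array([1.0, 0.0, -1.0])
x = entropic_resolvent(z, c, 0.1)

# Sanity check: grad h(x) + gamma*c == grad h(z).
lhs = np.log(x) + 1.0 + 0.1 * c
rhs = np.log(z) + 1.0
print(np.allclose(lhs, rhs))  # True
```

Unlike a Euclidean proximal step, this update keeps the iterate strictly positive automatically, which is exactly the domain-compatibility property that motivates matching $h$ to the constraint geometry.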

Parameter selection, particularly of the step sizes $\gamma_k$, is highly problem-dependent. For convex composite objectives, careful tuning of $\gamma_k$ is required to ensure contractivity and avoid divergence. Structural properties of the Bregman generator $h$, especially its strong convexity and domain compatibility, directly impact the admissible range of parameters.

A plausible implication is that applications with prior information about the metric structure or distributional constraints (e.g., categorical data, transport polytope) can exploit BPRS more naturally than classical quadratic-based methods.

5. Applications: Discrete Optimal Transport and Structured Convex Optimization

BPRS and related Bregman splitting algorithms have been applied to:

  • Discrete Optimal Transport (OT): By formulating the OT problem as an inclusion between normal cone operators and affine constraints, BPRS (and BDRS) yield iterative element-wise update schemes that avoid full projection onto the simplex at each iteration. Compared to Sinkhorn–Knopp, BPRS offers computational advantages in entropy-regularized or general Bregman-regularized settings. However, the paper notes that in some cases the required assumptions—such as the image of $\nabla h^*$ lying in the operator domain—may not strictly hold, so convergence proofs may not directly apply.
  • Structured Convex Optimization: For problems with conic or affine constraints and composite objectives, BPRS delivers a symmetric splitting scheme, inheriting desirable properties such as unconditional convergence under proper parameter regimes.

The algorithmic structure often leads to scalable implementations for networked or distributed environments, semidefinite relaxations, and imaging tasks, given that Bregman geometries can encode both sparsity and smoothness in the underlying solution space.

6. Comparison with Classical Peaceman–Rachford and Douglas–Rachford Methods

When $h(x) = \frac{1}{2}\|x\|^2$, BPRS coincides with classical PRS. In contrast to Douglas–Rachford, Peaceman–Rachford utilizes two unaveraged reflections per step, which can theoretically deliver faster local progress but requires stricter regularity for unconditional convergence. In Bregman settings, both DRS and PRS admit variants (BDRS and BPRS, respectively), with the choice of geometry rendering certain problems more tractable or more naturally aligned with the data.

The symmetric step structure of BPRS, and its affinity to Bregman ADMM, creates opportunities for constructing algorithms that are highly suited to dual and primal–dual formulations, particularly when dual variables admit Bregman-geometric representation.

7. Summary and Outlook

Bregman Peaceman–Rachford Splitting generalizes classical operator splitting schemes to settings where the quadratic norm is insufficient to capture problem geometry. By composing Bregman reflection operators, BPRS provides a powerful symmetry-based iterative splitting mechanism for monotone inclusions and structured optimization tasks, underpinned by explicit convergence theory for both relatively smooth and nonsmooth cases (Ma et al., 10 Sep 2025). Notably, its equivalence with symmetric Bregman ADMM and its applicability to OT and structured convex programs highlight its versatility.

Key practical guidelines include:

  • Careful selection of the Bregman generator $h$ matched to data and constraints,
  • Adherence to global parameter bounds for step sizes,
  • Verification of regularity and domain requirements when extending to new applications (especially in non-quadratic and entropy-regularized problems).
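Some of these checks can be automated. For instance, strong convexity of the entropy generator on the probability simplex follows from Pinsker's inequality, $\mathrm{KL}(x\|y) \geq \frac{1}{2}\|x-y\|_1^2 \geq \frac{1}{2}\|x-y\|_2^2$, which a quick randomized spot-check (a sketch, not part of the cited paper) can confirm numerically:

```python
import numpy as np

# Randomized spot-check of one regularity requirement listed above: on the
# probability simplex, the entropy generator's Bregman distance (the KL
# divergence) dominates 0.5*||x - y||_2^2, i.e. entropy is 1-strongly convex
# there relative to the Euclidean norm. A failed check would flag a bad
# generator/domain pairing before running the splitting scheme.
rng = np.random.default_rng(0)

def kl(x, y):
    return np.sum(x * np.log(x / y))

ok = True
for _ in range(1000):
    x = rng.dirichlet(np.ones(5))  # random point on the 5-dim simplex
    y = rng.dirichlet(np.ones(5))
    ok &= kl(x, y) >= 0.5 * np.sum((x - y) ** 2)
print(ok)  # True
```

A sampled check like this is no substitute for a proof, but it is a cheap guard when extending the method to a new generator or domain.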

Ongoing research addresses extensions to non-separable multi-block structures, adaptive parameter tuning, and inexact updating schemes, with particular emphasis on convergence in applied domains where non-Euclidean geometries dominate.
