
Nonlinear Additive Schwarz Method

Updated 6 February 2026
  • Nonlinear Additive Schwarz is a domain decomposition method that partitions a nonlinear problem into overlapping subdomains for parallel local solves.
  • It enables the construction of robust nonlinear preconditioners such as RASPEN and ASPIN, accelerating Newton and quasi-Newton convergence for complex PDEs.
  • Scalable implementations with coarse-level corrections ensure mesh-independent convergence and practical performance for large-scale nonlinear optimization problems.

The nonlinear Additive Schwarz method is a domain decomposition paradigm for solving large-scale nonlinear boundary value problems and optimization problems, generalizing the classical Additive Schwarz technique into the nonlinear and nonconvex regime. This framework underpins a range of nonlinear preconditioners, solver algorithms, and scalable iterative schemes, prominently including RASPEN (Restricted Additive Schwarz Preconditioned Exact Newton), ASPIN (Additive Schwarz Preconditioned Inexact Newton), and their substructured and coarse-grid-augmented variants. These methods exploit the structure of the nonlinear problem and the locality of nonlinear physics or optimization constraints to achieve parallel scalability, sharp convergence, and robust preconditioning properties for Newton-type and quasi-Newton-type outer iterations. The following sections present the mathematical foundations, algorithmic realizations, convergence theory, and practical implications for the nonlinear Additive Schwarz method as documented in the contemporary literature (Dolean et al., 2016, Kothari, 2022, Chaouqui et al., 2021, Kothari et al., 2022, Park, 2019, Park, 2020, Park, 2023).

1. Mathematical and Algorithmic Foundations

Let $F(u) = 0$, $u \in V$, be a nonlinear operator equation on a Hilbert or Banach space, typically arising from the discretization of a nonlinear PDE or convex variational problem. The domain is covered by $I$ overlapping subdomains with associated restriction $R_i : V \to V_i$ and prolongation $P_i : V_i \to V$ operators; restricted prolongations $\tilde P_i$ enforce the partition of unity $\sum_{i=1}^I \tilde P_i R_i = I_V$.

For each $u \in V$, the subdomain-local nonlinear Dirichlet (or Robin/optimized) problem is defined by $R_i F\big(P_i G_i(u) + (I - P_i R_i)u\big) = 0$, where $G_i(u)$ is the local solution. The parallel nonlinear additive Schwarz iteration is

$$u^n = \sum_{i=1}^I \tilde P_i G_i(u^{n-1}) \equiv \mathcal{G}_1(u^{n-1}),$$

yielding the fixed-point (preconditioned) equation $\mathcal{J}_1(u) := \mathcal{G}_1(u) - u = 0$. The subdomain correction form is $C_i(u) = G_i(u) - R_i u$, so $\mathcal{J}_1(u) = \sum_{i=1}^I \tilde P_i C_i(u)$.

This iteration is the basis for nonlinear stationary solvers, nonlinear preconditioners, and the construction of exact or inexact Newton/Krylov solvers, with the local nonlinear problems forming the heart of each parallel Schwarz sweep (Dolean et al., 2016, Kothari, 2022).
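As a concrete illustration, this fixed-point sweep can be sketched on a small 1D model problem. The sketch below is a minimal NumPy implementation; the model problem $-u'' + u^3 = 1$, the grid size, the two subdomains, and the overlap width are illustrative choices, not taken from the cited papers:

```python
import numpy as np

# Illustrative model problem: -u'' + u^3 = 1 on (0,1), u(0) = u(1) = 0,
# discretized on n interior points by centered finite differences.
n = 40
h = 1.0 / (n + 1)
A = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
f = np.ones(n)

def F(u):                       # global nonlinear residual
    return A @ u + u**3 - f

def J(u):                       # global Jacobian F'(u)
    return A + np.diag(3 * u**2)

# Two overlapping index sets (subdomains) and a nonoverlapping ownership
# partition realizing the restricted prolongations \tilde P_i.
subs  = [np.arange(0, 26), np.arange(14, 40)]
owner = [np.arange(0, 20), np.arange(20, 40)]

def local_solve(u, idx):
    """G_i(u): Newton on R_i F(P_i v + (I - P_i R_i) u) = 0,
    i.e. the exterior of the subdomain is frozen at u."""
    w = u.copy()
    for _ in range(25):
        r = F(w)[idx]
        if np.linalg.norm(r) < 1e-13:
            break
        w[idx] -= np.linalg.solve(J(w)[np.ix_(idx, idx)], r)
    return w[idx]

u = np.zeros(n)
for _ in range(60):             # fixed-point sweep u^n = G_1(u^{n-1})
    u_new = np.empty(n)
    for idx, own in zip(subs, owner):
        w = local_solve(u, idx)            # independent local solves
        u_new[own] = w[np.isin(idx, own)]  # keep only owned values
    done = np.linalg.norm(u_new - u) < 1e-11
    u = u_new
    if done:
        break

res = np.linalg.norm(F(u))
print(res)   # global residual, small once the sweep has converged
```

Each local solve is independent of the others within a sweep, which is what makes the iteration parallel; the ownership partition implements the restricted combination of local solutions.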

2. Nonlinear Preconditioning: RASPEN, ASPIN, and Extensions

Nonlinear Additive Schwarz formulations enable two principal preconditioning paradigms:

  • RASPEN (“Restricted Additive Schwarz Preconditioned Exact Newton”): Newton’s method is applied to the nonlinear preconditioned fixed-point equation $\mathcal{J}_1(u) = 0$, using the exact Jacobian

$$J_1(u) = \sum_{i=1}^I \tilde P_i \frac{dG_i}{du}(u) - I = -\sum_{i=1}^I \tilde P_i \big(R_i J(u^{(i)}) P_i\big)^{-1} R_i J(u^{(i)})$$

with $u^{(i)} = P_i G_i(u) + (I - P_i R_i)u$. The Newton–Krylov update is then

$$J_1(u^k)\,\delta^k = -\mathcal{J}_1(u^k), \qquad u^{k+1} = u^k + \delta^k.$$

Exploiting that the nonlinearly preconditioned residual operator arises from a convergent fixed-point map, RASPEN achieves a clustered Jacobian spectrum and rapid outer Newton convergence.

  • ASPIN (“Additive Schwarz Preconditioned Inexact Newton”): The preconditioned function is

$$\mathcal{J}^{\rm ASPIN}(u) = \sum_{i=1}^I P_i C_i(u)$$

with an approximate Jacobian (ignoring the $u^{(i)}$ dependency)

$$J_{\rm ASPIN}^{\rm inexact}(u) = -\left( \sum_i P_i \big(R_i J(u) P_i\big)^{-1} R_i \right) J(u).$$

ASPIN’s fixed-point iteration generally diverges without relaxation, and it is less effective as a nonlinear preconditioner, especially without coarse-level correction.
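On the same illustrative 1D setup used earlier, the RASPEN outer Newton iteration on $\mathcal{J}_1(u) = 0$ can be sketched as follows. For brevity the exact Jacobian $J_1$ is approximated columnwise by finite differences rather than assembled from the analytic formula above; the problem, subdomains, and tolerances are again hypothetical choices:

```python
import numpy as np

# Illustrative model problem: -u'' + u^3 = 1, zero Dirichlet BCs.
n = 20
h = 1.0 / (n + 1)
A = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
f = np.ones(n)
F = lambda u: A @ u + u**3 - f
J = lambda u: A + np.diag(3 * u**2)

subs  = [np.arange(0, 13), np.arange(7, 20)]   # overlapping subdomains
owner = [np.arange(0, 10), np.arange(10, 20)]  # nonoverlapping partition

def G(u, idx):
    """Local nonlinear solve G_i(u) by Newton, exterior frozen at u."""
    w = u.copy()
    for _ in range(30):
        r = F(w)[idx]
        if np.linalg.norm(r) < 1e-13:
            break
        w[idx] -= np.linalg.solve(J(w)[np.ix_(idx, idx)], r)
    return w[idx]

def calJ(u):
    """Preconditioned residual J_1(u) = sum_i P~_i G_i(u) - u."""
    out = -u.copy()
    for idx, own in zip(subs, owner):
        out[own] += G(u, idx)[np.isin(idx, own)]
    return out

# Outer Newton on J_1(u) = 0; Jacobian approximated column by column.
u, eps = np.zeros(n), 1e-7
for _ in range(10):
    r = calJ(u)
    if np.linalg.norm(r) < 1e-9:
        break
    Jac = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n); e[j] = eps
        Jac[:, j] = (calJ(u + e) - r) / eps
    u += np.linalg.solve(Jac, -r)

res = np.linalg.norm(F(u))
print(res)   # residual of the original, unpreconditioned equation
```

Because each preconditioned residual evaluation already performs converged local nonlinear solves, the outer Newton iteration typically needs only a handful of steps, which is the practical appeal of RASPEN-style preconditioning.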

Two-level versions of these preconditioners employ coarse spaces and FAS-type corrections to restore scalability for large numbers of subdomains, with the two-level nonlinear fixed-point map given by

$$u^{n+1} = u^n + P_0 C_0(u^n) + \sum_{i=1}^I \tilde P_i C_i\big(u^n + P_0 C_0(u^n)\big)$$

and Newton’s method applied to the corresponding nonlinear system (Dolean et al., 2016, Park, 2023).

Substructured variants (e.g., SRASPEN) restrict all operations to interface (“skeleton”) degrees of freedom, reducing memory and computational cost by operating solely on traces (Chaouqui et al., 2021).

3. Theoretical Analysis: Convergence, Scalability, and Robustness

Additive Schwarz domain decomposition in the nonlinear regime exhibits rigorous convergence properties under standard regularity assumptions: a Lipschitz continuous Jacobian $F'$, strong monotonicity or local coercivity, and stable, unique local subdomain solves $G_i$ (Dolean et al., 2016, Park, 2023). Core results include:

  • Local Contraction: The one-level nonlinear Schwarz map is a contraction near the solution $u^*$, with the contraction factor determined by the maximal subdomain interface error transfer.
  • Preconditioned Newton Quadratic Convergence: The spectrum of the preconditioned Jacobian $J_1(u^*)$ is clustered, ensuring quadratic convergence of Newton’s method.
  • Two-Level Scalability: Provided the coarse correction approximates global low-frequency error, two-level methods’ convergence factors depend only on geometric parameters such as $H/h$ and $H/\delta$ ($H$ = subdomain size, $h$ = element size, $\delta$ = overlap width), and not on the strength or structure of the nonlinearity (Park, 2023).
  • Nonlinearity-Independence: For semilinear elliptic problems with convex energy, the number of global nonlinear iterations is insensitive to the nonlinearity; coarse-grid augmented Schwarz achieves mesh-independent convergence (Park, 2023).
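The spectral-clustering claim in the second bullet can be made plausible with a short heuristic argument (a sketch in this article's notation, not a derivation quoted from the cited papers):

```latex
% Heuristic: near u^*, the Schwarz map G_1 is a contraction, so every
% eigenvalue \lambda of its derivative satisfies |\lambda| \le \rho < 1.
% Since J_1(u^*) = \mathcal{G}_1'(u^*) - I, each eigenvalue
% \mu = \lambda - 1 of J_1(u^*) satisfies |\mu + 1| = |\lambda| \le \rho:
\[
  \sigma\bigl(J_1(u^*)\bigr) \subset \{\, z \in \mathbb{C} : |z+1| \le \rho \,\},
  \qquad \rho := \rho\bigl(\mathcal{G}_1'(u^*)\bigr) < 1,
\]
% so the spectrum is clustered in a disk around -1 and, for well-behaved
% (e.g. near-normal) Jacobians, Krylov iterations on
% J_1(u^k)\,\delta^k = -\mathcal{J}_1(u^k) converge at a rate governed by \rho.
```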

Convergence for optimization problems (including those with bound constraints) and nonsmooth settings (e.g., $L^1$ penalties, variational inequalities) is established via convex splitting, subspace decomposition theorems, and abstract additive Schwarz lemmas, yielding either linear or sublinear global contraction rates (Park, 2019, Park, 2023).

4. Nonlinear Schwarz Preconditioners in Newton and Quasi-Newton Methods

Nonlinear Additive Schwarz can be embedded in various algorithmic outer iterations:

  • Full Newton–Krylov: RASPEN and its relatives provide highly effective nonlinear preconditioners for global Newton–Krylov solvers. Each Newton step involves assembling the global preconditioned residual, applying the (exact or approximate) Jacobian via local subdomain Newton solves, and updating the global iterate.
  • Quasi-Newton/Accelerated Methods: The left- or right-preconditioned Schwarz operator replaces the classical gradient in quasi-Newton (QN) or Anderson acceleration (AA) frameworks, with preconditioned secant pairs defined in the space of preconditioned residuals to ensure superlinear local convergence (Kothari, 2022). For bound-constrained Newton–SQP, the NRAS-B preconditioner serves as a right-preconditioner for the first-order optimality system, enhanced by coarse spaces that are solution-dependent (Kothari et al., 2022).
  • Gradient/Projected Gradient Regimes: For convex optimization and variational inequalities, the additive Schwarz iteration can be regarded as a block gradient step in a non-Euclidean metric, enabling the use of acceleration (e.g., FISTA-type momentum and adaptive restart) for improved convergence (Park, 2020, Park, 2019).
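The acceleration idea mentioned above can be illustrated generically. The sketch below applies windowed Anderson acceleration to a toy contraction map standing in for the Schwarz operator $\mathcal{G}_1$; the map, the window size m, and the tolerances are illustrative choices, not from the cited papers:

```python
import numpy as np

def anderson(g, u0, m=3, tol=1e-10, maxit=100):
    """Windowed Anderson acceleration of the fixed-point iteration
    u <- g(u), using the last m residual differences."""
    u = u0.copy()
    U, R = [u.copy()], [g(u) - u]           # iterates and residuals
    for k in range(1, maxit + 1):
        if np.linalg.norm(R[-1]) < tol:
            return u, k - 1
        mk = min(m, len(R) - 1)
        if mk == 0:
            u = g(u)                        # plain Picard step
        else:
            # least-squares fit of the newest residual by past differences
            dR = np.column_stack([R[-i] - R[-i-1] for i in range(1, mk+1)])
            gamma, *_ = np.linalg.lstsq(dR, R[-1], rcond=None)
            dG = np.column_stack([(U[-i] + R[-i]) - (U[-i-1] + R[-i-1])
                                  for i in range(1, mk+1)])
            u = (u + R[-1]) - dG @ gamma    # mixed update of g-values
        U.append(u.copy()); R.append(g(u) - u)
        U, R = U[-(m+2):], R[-(m+2):]       # keep only the window
    return u, maxit

# Toy contraction standing in for the Schwarz map (illustrative only).
g = lambda u: np.cos(u) * np.array([1.0, 0.5])

u_aa, it_aa = anderson(g, np.zeros(2))

# Plain Picard iteration on the same map, for comparison.
u_pic, it_pic = np.zeros(2), 0
while np.linalg.norm(g(u_pic) - u_pic) > 1e-10 and it_pic < 500:
    u_pic = g(u_pic); it_pic += 1

print(it_aa, it_pic)   # AA typically needs far fewer iterations
```

In the preconditioned setting described above, `g` would be the Schwarz fixed-point map, and the secant information is collected in the space of preconditioned residuals.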

The precise formulation, construction of preconditioned secant pairs, and handling of bound/nonsmooth constraints are essential to retain the superlinear convergence properties and global robustness of the QN and Newton outer methods (Kothari, 2022, Kothari et al., 2022).

5. Scalability, Coarse-Level Correction, and Practical Implementation

Scalability—insensitivity of solver iteration counts to the number of subdomains, mesh parameters, or overlap width—is achieved by augmenting the additive Schwarz framework with a coarse-level correction, typically implemented via full approximation scheme (FAS) or solution-adaptive coarse interpolation:

  • Coarse Spaces: Constructed via finite element subspaces, positivity-preserving or bound-compatible interpolation operators, or solution-dependent feasible sets (especially for bound-inequality constraints), with prolongation/restriction tailored to preserve key problem structure (Dolean et al., 2016, Kothari et al., 2022, Park, 2023).
  • Two-Level Algorithms: The subdomain and coarse solves are performed sequentially or multiplicatively per outer iteration, with the coarse-step preceding the subdomain sweep.
  • Parallelism: Subdomain nonlinear solves are fully independent, with global communication limited, in the overlapping case, to interface values or, in the substructured/skeleton variant, to trace data. Coarse-level solves remain of relatively low dimension.
  • Inexact Local Solvers and Tolerance Balancing: Empirical studies confirm that the global convergence and robustness are not significantly degraded by using inexact, efficiently computed local nonlinear solves, provided tolerances are balanced.

Pseudocode for one-level and two-level RASPEN, NRAS-B, and their substructured variants is given explicitly in the literature, with parallel for-loops for local subdomain nonlinear Dirichlet/Robin solves, global residual assembly, preconditioned Jacobian formation, the Newton/Krylov step, and the iterate update (Dolean et al., 2016, Kothari et al., 2022, Chaouqui et al., 2021).

6. Applications and Numerical Performance

Nonlinear Additive Schwarz methods have been applied to semilinear and fully nonlinear elliptic equations (including $s$-Laplacians, Forchheimer models, and nonlinear Poisson–Boltzmann), variational inequalities (including fourth-order), and PDE-constrained optimization with nonsmooth (e.g., $L^1$) penalties and bound constraints.

Representative performance data (Dolean et al., 2016, Park, 2023, Kothari, 2022, Kothari et al., 2022, Park, 2023):

| Problem/Class | Method | Outer Its | Total Subdomain Solves | Mesh Dependence/Scalability |
|---|---|---|---|---|
| 1D Forchheimer, 8 subdomains | Newton | ~12 | ~400 | Divergent for ASPIN without relaxation |
| | RASPEN (1L) | ~7 | ~150 | Scalable with coarse grid |
| | ASPIN (1L) | diverges | – | Diverges if not relaxed |
| | RASPEN (2L) | ~8 | ~260 | Iteration count mesh/overlap-independent |
| 2D nonlinear Poisson, $16^2$ subdomains | RASPEN (1L) | 3 | ~400 | |
| | ASPIN (1L) | 3–4 | ~800 | |
| | RASPEN (2L) | 3 | ~100 | |
| Semilinear elliptic, $n = 2, \ldots, 32$ | TL-NRAS-B | 5–8 | | Iteration count $n$-independent (2L) |
| Fourth-order VI, $H/h = 2^3$ | ASM (2L) | | | Linear convergence, mesh-independent |

The methods exhibit weak or strong scalability: the two-level preconditioned outer-iteration counts remain essentially constant as the mesh is refined or the number of subdomains grows. In nonsmooth or highly nonlinear cases, nonlinear-Schwarz-preconditioned Newton or QN methods can reduce iteration counts by an order of magnitude relative to unpreconditioned approaches (Kothari et al., 2022, Kothari, 2022).

7. Recent Extensions and Future Directions

Recent developments include:

  • Substructured and skeleton-based methods: Restricting the Schwarz iteration to interface (trace) unknowns reduces memory and computational cost, with proven equivalence between full-space and substructured iterations under compatibility conditions (Chaouqui et al., 2021).
  • Accelerated, momentum, and restarted Schwarz: Incorporating FISTA-like acceleration, adaptive restart, and non-Euclidean gradient steps yields up to an order-of-magnitude acceleration in convergence for both smooth and nonsmooth problems (Park, 2020).
  • Optimized and nonsmooth preconditioners: Extensions include nonlinear optimized Schwarz with Robin coupling, semismooth analysis, and robust handling of non-differentiable and box-constrained terms (Ciaramella et al., 2021).
  • Application to fourth-order and variational inequality constraints: Two-level positivity-preserving coarse interpolants yield scalable convergence for fourth-order PDEs and variational inequalities (Park, 2023).
  • Robustness for nonconvex and highly nonlinear regimes: Multilevel nonlinear Schwarz preconditioners increase the Newton–Krylov basin of attraction, improving global convergence and Newton robustness in challenging settings (Dolean et al., 2016, Aiton et al., 2019).

These directions collectively frame the nonlinear Additive Schwarz method as a central primitive for scalable, robust solution strategies for broad classes of discretized nonlinear PDEs, variational inequalities, and PDE-constrained optimization problems.
