Exponential Ergodicity of RLMC
- Exponential ergodicity of RLMC is the property that the chain converges to its invariant distribution at an exponential rate, uniformly over initial states, under strong convexity and smoothness conditions.
- The methodology leverages a randomized midpoint approach that reduces drift bias to O(h²), significantly enhancing convergence compared to traditional Langevin algorithms.
- Key theoretical results include explicit contraction rates derived via geometric drift and minorization, offering actionable guidelines for step-size selection and algorithm robustness.
Exponential ergodicity of RLMC refers to the property that the randomized midpoint Langevin Monte Carlo (RLMC) chain, under suitable conditions, converges to its invariant distribution at an exponential rate, uniformly over initial states. This article provides a comprehensive account of definitions, theoretical guarantees, methodologies, explicit convergence rates, and their implications within the wider Markov process literature.
1. Definitions and Theoretical Framework
Exponential ergodicity denotes the existence of constants $C < \infty$ and $\rho \in (0,1)$ such that, for a Markov chain with transition kernel $P$ and invariant distribution $\pi$, the distance between the $n$-step law from any initial state $x$ and $\pi$ in a suitable norm decays as $C(x)\,\rho^n$. For RLMC, the Markov chain is constructed as follows:
Given $f : \mathbb{R}^d \to \mathbb{R}$ twice continuously differentiable (with parameters $0 < m \le M$ such that $m I_d \preceq \nabla^2 f(x) \preceq M I_d$ for all $x \in \mathbb{R}^d$), the chain iterates by first drawing $u_k \sim \mathrm{Unif}[0,1]$ and independent Gaussians $\xi_k, \xi_k' \sim \mathcal{N}(0, I_d)$, computing the "midpoint"

$$y_k = x_k - u_k h\,\nabla f(x_k) + \sqrt{2 u_k h}\,\xi_k,$$

and then setting

$$x_{k+1} = x_k - h\,\nabla f(y_k) + \sqrt{2 u_k h}\,\xi_k + \sqrt{2(1-u_k)h}\,\xi_k'.$$
The corresponding kernel $P_h$ defines a time-homogeneous Markov chain with a generator that approximates the overdamped Langevin diffusion's generator $\mathcal{L} = \Delta - \nabla f \cdot \nabla$ up to $O(h^2)$ error (Li et al., 17 Nov 2025).
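One sweep of this iteration can be sketched in NumPy. This is a minimal illustration of the standard randomized-midpoint discretization of overdamped Langevin on a Gaussian toy target; the function names and the target are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def rlmc_step(x, grad_f, h, rng):
    """One randomized-midpoint step for a batch of chains x of shape (n, d):
    draw u ~ Unif[0,1], form the midpoint y = x - u*h*grad_f(x) + sqrt(2*u*h)*xi,
    then move x -> x - h*grad_f(y) + sqrt(2*u*h)*xi + sqrt(2*(1-u)*h)*xi2,
    reusing xi so both Brownian increments come from a single path."""
    n, d = x.shape
    u = rng.uniform(size=(n, 1))          # randomized midpoint time u_k
    xi = rng.standard_normal((n, d))      # increment over [0, u*h]
    xi2 = rng.standard_normal((n, d))     # independent increment over [u*h, h]
    y = x - u * h * grad_f(x) + np.sqrt(2 * u * h) * xi
    return x - h * grad_f(y) + np.sqrt(2 * u * h) * xi + np.sqrt(2 * (1 - u) * h) * xi2

# Illustrative target: standard Gaussian, f(x) = |x|^2 / 2, so grad_f(x) = x.
rng = np.random.default_rng(0)
x = rng.standard_normal((10_000, 2))
for _ in range(200):
    x = rlmc_step(x, lambda z: z, h=0.1, rng=rng)
print(x.var(axis=0))  # each component's variance should be close to 1
```

Keeping the same $\xi_k$ in both lines is the exact Brownian coupling that makes the midpoint correction effective; drawing fresh noise for the second line would break the $O(h^2)$ bias reduction.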
2. Main Exponential Ergodicity Results
The main theorem established for RLMC asserts that, under the strong convexity and smoothness conditions on $f$ and for step size $h \in (0, h_0]$, the RLMC chain is exponentially ergodic. There exist explicit constants $C < \infty$, $\rho \in (0,1)$, and a Lyapunov function $V$ such that

$$\|P_h^n(x,\cdot) - \pi_h\|_V \le C\,V(x)\,\rho^n,$$

or equivalently, for $\kappa = \log(1/\rho) > 0$,

$$\|P_h^n(x,\cdot) - \pi_h\|_V \le C\,V(x)\,e^{-\kappa n},$$

where $\|\cdot\|_V$ denotes the weighted total variation ($V$-norm) distance,

$$\|\mu\|_V = \sup_{|g| \le V} \left| \int g \, d\mu \right|.$$

The contraction rate $\rho$ and the prefactor $C$ are explicit functions of the drift and minorization constants (Li et al., 17 Nov 2025).
3. Proof Methodology: Drift and Minorization
The exponential ergodicity proof has three main components:
- Geometric Drift Condition: For the Lyapunov function $V(x) = 1 + \|x\|^2$, the RLMC kernel satisfies $P_h V \le \lambda V + b$ with $\lambda \in (0,1)$ (for sufficiently small $h$) and an explicit constant $b < \infty$, ensuring the process contracts towards a compact set in expectation.
- Minorization (Small-Set Condition): Every ball $B_R = \{x : \|x\| \le R\}$ is $(\epsilon, \nu)$-small for some $\epsilon > 0$ and probability measure $\nu$, i.e., $P_h(x,\cdot) \ge \epsilon\,\nu(\cdot)$ for all $x \in B_R$, since the Gaussian transition density admits a uniform lower bound on subsets of positive measure.
- Meyn–Tweedie Theorem: Combining drift and minorization yields the existence and uniqueness of an invariant law, with exponential convergence in weighted total variation (Li et al., 17 Nov 2025).
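The geometric drift condition can be probed numerically. The sketch below is an illustrative assumption (standard Gaussian target, the usual randomized-midpoint step): it starts many copies of the chain at a single far-out point and checks that the one-step expectation of $V(x) = 1 + \|x\|^2$ contracts.

```python
import numpy as np

def rlmc_step(x, grad_f, h, rng):
    """Randomized-midpoint Langevin step for a batch of chains x (n, d)."""
    n, d = x.shape
    u = rng.uniform(size=(n, 1))
    xi, xi2 = rng.standard_normal((2, n, d))
    y = x - u * h * grad_f(x) + np.sqrt(2 * u * h) * xi
    return x - h * grad_f(y) + np.sqrt(2 * u * h) * xi + np.sqrt(2 * (1 - u) * h) * xi2

rng = np.random.default_rng(1)
h, d = 0.1, 5
V = lambda z: 1.0 + (z ** 2).sum(axis=1)    # Lyapunov function V(x) = 1 + |x|^2

# Start 100k chains at one point far from the origin and estimate E[V(X_1) | X_0 = x0].
x0 = np.full((100_000, d), 10.0)
pv = V(rlmc_step(x0, lambda z: z, h, rng)).mean()
lam = pv / V(x0[:1])[0]   # empirical one-step contraction factor at x0
print(lam)                # strictly below 1, consistent with P_h V <= lam V + b
```

Far from the origin the additive constant $b$ is negligible relative to $V$, so the ratio $P_h V / V$ directly exposes the contraction factor $\lambda$.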
The midpoint in RLMC reduces the discretization bias in the drift to $O(h^2)$, yielding faster mixing than the unadjusted Langevin algorithm (ULA), whose drift bias is only $O(h)$ (Li et al., 17 Nov 2025).
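The bias gap can be seen empirically. A hedged sketch, on a one-dimensional standard Gaussian target whose true variance is 1 (step size and sample counts are arbitrary illustrative choices, not from the paper): ULA's stationary variance overshoots by $O(h)$, while the randomized-midpoint chain is nearly unbiased.

```python
import numpy as np

def ula_step(x, h, rng):
    # ULA on f(x) = x^2/2: grad_f(x) = x.
    return x - h * x + np.sqrt(2 * h) * rng.standard_normal(x.shape)

def rlmc_step(x, h, rng):
    # Randomized midpoint on the same target; grad_f(y) = y.
    u = rng.uniform(size=x.shape)
    xi, xi2 = rng.standard_normal((2,) + x.shape)
    y = x - u * h * x + np.sqrt(2 * u * h) * xi
    return x - h * y + np.sqrt(2 * u * h) * xi + np.sqrt(2 * (1 - u) * h) * xi2

rng = np.random.default_rng(2)
h, n, steps = 0.2, 200_000, 150
xu = xr = rng.standard_normal(n)          # shared iid start for both samplers
for _ in range(steps):
    xu, xr = ula_step(xu, h, rng), rlmc_step(xr, h, rng)

ula_bias, rlmc_bias = abs(xu.var() - 1), abs(xr.var() - 1)
print(ula_bias, rlmc_bias)   # ULA's O(h) variance bias dwarfs RLMC's
```

For this Gaussian target ULA's stationary variance is exactly $1/(1 - h/2)$, so at $h = 0.2$ its bias is about $0.11$, orders of magnitude above the randomized-midpoint chain's.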
4. Explicit Rates and Step Size Constraints
The key contraction parameter $\rho \in (0,1)$, built from the drift pair $(\lambda, b)$ and the minorization constant $\epsilon$ via the Meyn–Tweedie construction, yields the spectral gap of the Markov kernel: $\mathrm{gap}(P_h) \ge 1 - \rho$ for $h \in (0, h_0]$. The prefactor $C$ is likewise determined by the Lyapunov-function and small-set constants.
For the time-randomized skeleton (continuous-time RLMC), analogous spectral-gap arguments give exponential ergodicity with the continuous-time rate matching the underlying diffusion, i.e., contraction at rate $e^{-mt}$ under $m$-strong convexity of $f$ (Mao et al., 2021).
5. Relationship to General Ergodicity and Markov Process Theory
RLMC inherits its ergodicity properties from general Markov semigroup theory:
- For reversible chains, the exponential ergodicity rate equals the spectral gap (Mao et al., 2021, Guo et al., 2020).
- Uniform ergodicity in total variation is guaranteed provided a geometric drift-minorization pair holds (Li et al., 17 Nov 2025), and the actual convergence rate can be equated to the exponential rate under tight hitting-time control for small sets (Mao et al., 2021).
- In the setting of functional ergodicity (e.g., -norms), the spectral gap criterion is necessary and sufficient for exponential ergodicity when the process is reversible; the same holds as a sufficient condition in the non-reversible case (Guo et al., 2020).
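The reversible-chain statement above can be illustrated on a small example (a toy three-state birth–death chain, not taken from the cited papers): the empirical per-step total-variation contraction factor matches the second-largest eigenvalue modulus, i.e., one minus the spectral gap.

```python
import numpy as np

P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])          # birth-death chain, hence reversible
pi = np.array([0.25, 0.50, 0.25])           # detailed balance: pi_i P_ij = pi_j P_ji

eig = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
gap = 1.0 - eig[1]                          # spectral gap 1 - |lambda_2|

mu = np.array([1.0, 0.0, 0.0])              # start in state 0
tv = []
for _ in range(30):
    mu = mu @ P
    tv.append(0.5 * np.abs(mu - pi).sum())  # total-variation distance to pi

rate = tv[-1] / tv[-2]                      # empirical per-step contraction factor
print(gap, rate)                            # rate equals |lambda_2| = 1 - gap
```

Here the eigenvalues of $P$ are $1$, $1/2$, and $0$, so the spectral gap is $1/2$ and the TV distance shrinks by exactly that factor each step once the transient mode dies out.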
Summary of subordinate results:
| Property | Condition | Rate Description |
|---|---|---|
| Geometric ergodicity | $m$-strong convexity, $M$-smoothness, small $h$ | $\|P_h^n(x,\cdot) - \pi_h\|_V \le C\,V(x)\,\rho^n$ |
| Uniform total variation | Small-set (minorization) plus drift | $\|P_h^n(x,\cdot) - \pi_h\|_{TV} \le C\,\rho^n$ |
| $L^2$-spectral gap | Reversible, Poincaré inequality | Decay at rate $e^{-\lambda t}$, $\lambda$ the spectral gap |
| Weighted $V$-ergodicity | $P_h V \le \lambda V + b$, spectral gap positive | $C\,V(x)\,\rho^n$ |
The RLMC thus achieves the optimal exponential rate allowed by the diffusion process being discretized.
6. RLMC in Broader Algorithmic Contexts and Infinite Dimensions
RLMC-type models are linked to infinite-dimensional OU processes with cylindrical Lévy noise, where exponential ergodicity has also been established under spectral gap and lower-bound conditions on the noise (Wang, 2015). The RLMC methodology integrates into the general framework by verifying the geometric drift and small-set criteria, which translates to exponential mixing in total variation with explicit rates.
If the generator admits a uniform spectral gap and the noise is sufficiently rich (in the sense of the lower Bernstein function bound), ergodicity extends to infinite-dimensional or randomized linear Markov chain settings (Wang, 2015).
7. Metrics, Extensions, and Applications
Convergence analysis for RLMC is typically presented in weighted total variation, Wasserstein distance, relative entropy, or $L^2$-norms. The explicit rates established for the RLMC algorithm provide not only theoretical mixing guarantees but also practical guidance for step-size selection and algorithmic robustness, especially for strongly log-concave targets.
The RLMC framework also provides a foundation for understanding ergodicity in perturbed and reflected variants of Langevin dynamics as encountered in stochastic sampling, statistical physics, and high-dimensional Bayesian inference.
References:
- (Li et al., 17 Nov 2025) "Convergence rate of randomized midpoint Langevin Monte Carlo"
- (Mao et al., 2021) "Convergence Rates in Uniform Ergodicity by Hitting Times and $L^2$-exponential Convergence Rates"
- (Guo et al., 2020) "Estimate the exponential convergence rate of f-ergodicity via spectral gap"
- (Wang, 2015) "Linear Evolution Equations with Cylindrical Lévy Noise: Gradient Estimates and Exponential Ergodicity"