Skew-Reflected Non-Reversible Langevin Dynamics
- SRNLD is a stochastic process that enhances sampling from constrained domains by combining non-reversible antisymmetric drift with skewed boundary reflections.
- It rotates the gradient flow using a skew-symmetric matrix to accelerate convergence and reduce autocorrelation while preserving the invariant measure.
- SRNLD offers rigorous convergence guarantees and has shown improved performance in applications such as Bayesian regression and high-dimensional classification.
Skew-Reflected Non-Reversible Langevin Dynamics (SRNLD) is a class of stochastic processes designed for efficient sampling from target distributions supported on constrained domains. SRNLD generalizes classical reversible reflected Langevin dynamics by introducing non-reversibility through antisymmetric drift and constructing a "skew" boundary reflection mechanism. This combination yields accelerated convergence toward equilibrium and sharper concentration of the process around its invariant measure, especially in high-dimensional or complex constrained settings.
1. Mathematical Structure and Motivation
SRNLD aims to sample from a probability density $\pi(x) \propto e^{-f(x)}$ supported on a constrained domain $\mathcal{K} \subset \mathbb{R}^d$, where $f$ is smooth. The continuous-time dynamics are given by the skew-reflected stochastic differential equation
$$ dX_t = -\big(I + J(X_t)\big)\nabla f(X_t)\,dt + \sqrt{2}\,dB_t + \big(I + J(X_t)\big)\,n(X_t)\,dL_t, $$
where
- $J(x)$ is a skew-symmetric matrix field ($J(x)^\top = -J(x)$),
- $B_t$ is a standard Brownian motion,
- $(I + J(x))$ adjusts the normal vector at the boundary to ensure the process remains in $\mathcal{K}$,
- $L_t$ is a measure increasing only when $X_t \in \partial\mathcal{K}$, i.e., the local time on the boundary,
- $n(x)$ is the inward unit normal at $x \in \partial\mathcal{K}$.
The innovation in SRNLD is the coordinated design of the antisymmetric drift and the "skew" boundary reflection to maintain the desired invariant measure while breaking detailed balance and improving ergodic properties.
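In generator form (a standard reconstruction from the definitions above, not a verbatim statement of the source), the process is associated with
$$ \mathcal{L}\phi(x) = -\big(I + J(x)\big)\nabla f(x)\cdot\nabla\phi(x) + \Delta\phi(x), \qquad x \in \mathcal{K}, $$
subject to the oblique boundary condition $\big((I + J(x))\,n(x)\big)\cdot\nabla\phi(x) = 0$ for $x \in \partial\mathcal{K}$; when $J(x)\,n(x) = 0$ on the boundary, this reduces to the familiar Neumann condition $n(x)\cdot\nabla\phi(x) = 0$.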
2. Non-Reversibility and Skew Reflection Mechanism
The non-reversible drift is introduced via the skew-symmetric $J(x)$, resulting in a drift vector
$$ b(x) = -\big(I + J(x)\big)\nabla f(x), $$
which rotates the gradient flow. This non-gradient perturbation induces probability currents within $\mathcal{K}$, breaking reversibility (detailed balance); when properly constructed, it does not alter the invariant measure but accelerates mixing and reduces sampling autocorrelation.
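To see why the invariant measure survives the perturbation, consider a constant skew-symmetric $J$ (a simplifying assumption for this short verification): the extra stationary Fokker–Planck flux is divergence-free,
$$ \nabla\cdot\big(\pi\,J\nabla f\big) = \nabla\pi\cdot J\nabla f + \pi\,\nabla\cdot\big(J\nabla f\big) = -\pi\,\nabla f\cdot J\nabla f + \pi\sum_{i,j} J_{ij}\,\partial_i\partial_j f = 0, $$
since $v\cdot Jv = 0$ for any vector $v$ and the antisymmetric $J_{ij}$ contracts to zero against the symmetric Hessian $\partial_i\partial_j f$. For state-dependent $J(x)$, the analogous interior condition appears in Section 3.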
The boundary reflection is "skew" in the sense that, when the process hits $\partial\mathcal{K}$, it is reflected in the direction $(I + J(x))\,n(x)$ rather than simply $n(x)$. This ensures compatibility with the non-reversible interior drift and keeps the target density $\pi$ invariant for the process. The mathematical justification and well-posedness are established via an associated Skorokhod problem with oblique (skewed) reflection; under convexity and regularity conditions on $\mathcal{K}$ and smoothness of $f$, existence and uniqueness of the solution hold.
3. Boundary Conditions and Design of the Skew-Symmetric Matrix
The construction of an admissible matrix field $J(x)$ is crucial. Analysis shows that, to retain the invariant measure and guarantee well-posedness, it is necessary that
$$ J(x)\,n(x) = 0 \qquad \text{for all } x \in \partial\mathcal{K}. $$
This requirement ensures the skew reflection does not "push" the process outside the domain, and it allows the boundary condition for the generator to reduce to a Neumann-type form, facilitating both theoretical analysis and implementation.
Within the interior of $\mathcal{K}$, a further divergence-free constraint ($\nabla\cdot\big(e^{-f}\,J\nabla f\big) = 0$, which holds automatically when $J$ is constant) may be imposed for the process to admit the correct stationary law.
Construction examples include:
- For the unit ball, $J$ may be constructed via a block-diagonal or cross-product form ensuring $J(x)\,x = 0$ for $\|x\| = 1$.
- For general convex domains, $J$ is built to annihilate the local normal direction at the boundary.
The practical construction of an efficient $J(x)$ is informed by domain geometry and the need to maintain ergodicity and reflection consistency.
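As a concrete illustration, the sketch below uses the scalar-modulated form $J(x) = (1 - \|x\|^2)\,A$ for a fixed antisymmetric matrix $A$, which vanishes along the normal on the unit sphere; this specific form is an illustrative choice, not the only admissible one.

```python
import numpy as np

def make_skew_field(d, seed=0):
    """State-dependent skew-symmetric field for the unit ball (illustrative).

    Uses the scalar-modulated form J(x) = (1 - ||x||^2) * A with a fixed
    antisymmetric matrix A. The prefactor vanishes on the sphere ||x|| = 1,
    so J(x) n(x) = 0 there (the normal n(x) is parallel to x).
    """
    rng = np.random.default_rng(seed)
    M = rng.standard_normal((d, d))
    A = 0.5 * (M - M.T)  # antisymmetric: A.T == -A

    def J(x):
        return (1.0 - x @ x) * A

    return J

# Sanity checks: antisymmetry in the interior, annihilation of the
# normal direction on the boundary.
J = make_skew_field(d=3)
x_in = np.array([0.2, -0.1, 0.3])
assert np.allclose(J(x_in), -J(x_in).T)
x_bd = np.array([1.0, 0.0, 0.0])
assert np.allclose(J(x_bd) @ x_bd, 0.0)
```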
4. Convergence Guarantees and Large Deviations Theory
SRNLD achieves strictly faster convergence to the target law than its reversible counterpart. Let $\lambda_{\mathrm{SRNLD}}$ denote the spectral gap associated with the generator of SRNLD, and $\lambda_{\mathrm{RLD}}$ that of the reversible reflected Langevin dynamics (RLD). Then
$$ \lambda_{\mathrm{SRNLD}} \ge \lambda_{\mathrm{RLD}}, $$
implying exponential convergence in total variation and $1$-Wasserstein distance:
$$ W_1\big(\mathcal{L}(X_t), \pi\big) \le C\,e^{-\lambda_{\mathrm{SRNLD}}\,t}, $$
where the constant $C$ depends on the initial distribution.
Further, a large deviation principle (LDP) for the empirical measures of SRNLD reveals enhanced concentration relative to the reversible process. The LDP rate function decomposes as
$$ I_{\mathrm{SRNLD}}(\mu) = I_{\mathrm{RLD}}(\mu) + I_{\mathrm{anti}}(\mu), $$
with:
- $I_{\mathrm{RLD}}$ corresponding to the symmetric (reversible) part,
- $I_{\mathrm{anti}}$ a non-negative contribution due to the antisymmetric drift.
Since $I_{\mathrm{anti}}(\mu) \ge 0$, large deviations are rarer in SRNLD than in RLD, all else equal, quantifying a fundamental acceleration in sampling.
5. Discretization: Skew-Reflected Non-Reversible Langevin Monte Carlo
To enable implementation, SRNLD can be discretized using Euler–Maruyama-type schemes. The Skew-Reflected Non-Reversible Langevin Monte Carlo (SRNLMC) update is
$$ x_{k+1} = \mathcal{P}\big(x_k - \eta\,(I + J(x_k))\,\nabla f(x_k) + \sqrt{2\eta}\,\xi_k\big), $$
where:
- $\xi_k \sim \mathcal{N}(0, I_d)$ are i.i.d. standard Gaussians and $\eta > 0$ is the step size,
- $\mathcal{P}$ is the skew projection along the appropriately modified normal, which reduces to the standard Euclidean projection onto $\mathcal{K}$ when $J(x)\,n(x) = 0$ on $\partial\mathcal{K}$.
Non-asymptotic discretization error bounds are established. With step size $\eta$ and $N$ iterations, the overall sampling error in $1$-Wasserstein distance is controlled by a bound of the form
$$ W_1\big(\mathcal{L}(x_N), \pi\big) \le C_1\,e^{-\lambda_{\mathrm{SRNLD}}\,N\eta} + C_2\,\eta^{\alpha} $$
for a scheme-dependent discretization order $\alpha > 0$, so the iteration complexity improves as $\lambda_{\mathrm{SRNLD}}$ increases with better design of $J$.
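The following is a minimal, self-contained sketch of SRNLMC for a standard Gaussian truncated to the unit ball, so that $f(x) = \|x\|^2/2$ and $\nabla f(x) = x$. The scalar-modulated field $J(x) = (1 - \|x\|^2)A$ and the use of the Euclidean projection in place of the skew projection (valid here because $J(x)\,n(x) = 0$ on the sphere) are assumptions of this illustration.

```python
import numpy as np

def project_to_ball(x):
    """Euclidean projection onto the closed unit ball."""
    r = np.linalg.norm(x)
    return x / r if r > 1.0 else x

def srnlmc(grad_f, J, d, eta=1e-2, n_steps=20_000, seed=1):
    """SRNLMC sketch: projected Euler-Maruyama with non-reversible drift.

    Update: x_{k+1} = P(x_k - eta * (I + J(x_k)) @ grad_f(x_k)
                        + sqrt(2 * eta) * xi_k),   xi_k ~ N(0, I_d).
    """
    rng = np.random.default_rng(seed)
    I = np.eye(d)
    x = np.zeros(d)
    samples = np.empty((n_steps, d))
    for k in range(n_steps):
        xi = rng.standard_normal(d)
        drift = (I + J(x)) @ grad_f(x)
        x = project_to_ball(x - eta * drift + np.sqrt(2.0 * eta) * xi)
        samples[k] = x
    return samples

# Target: standard Gaussian truncated to the unit ball, so grad_f(x) = x.
d = 3
rng = np.random.default_rng(0)
M = rng.standard_normal((d, d))
A = 0.5 * (M - M.T)                    # fixed antisymmetric matrix
J = lambda x, A=A: (1.0 - x @ x) * A   # J(x) n(x) = 0 on the sphere
samples = srnlmc(grad_f=lambda x: x, J=J, d=d)
print("sample mean:", samples.mean(axis=0))  # near 0 by symmetry
```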
6. Empirical Performance and Applications
Experimental studies confirm that SRNLMC with a properly constructed $J$ outperforms projected/reversible Langevin algorithms on a variety of sampling tasks, including:
- Sampling truncated multivariate normals in balls and cubes,
- Bayesian linear and logistic regression with norm constraints,
- Real-world data classification tasks with high-dimensional parameter constraints.
Performance metrics, such as convergence in Wasserstein distance, mean squared error in posterior estimation, and classification accuracy, consistently validate the theoretical predictions. The advantage is most pronounced when SRNLMC uses a state-dependent, structured $J(x)$ chosen to vanish along the normal at the boundary, which both satisfies the theoretical requirements and yields robust, stable practical behavior.
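As a lightweight diagnostic along these lines (a sketch rather than the benchmark protocol of the cited works; it reuses the `samples` array from the SRNLMC sketch above), the $1$-Wasserstein distance between one-dimensional marginals can be estimated with SciPy:

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Reference samples: rejection sampling of the Gaussian truncated to the
# unit ball (cheap in low dimension; illustrative only).
rng = np.random.default_rng(2)
raw = rng.standard_normal((200_000, 3))
reference = raw[np.linalg.norm(raw, axis=1) <= 1.0]

# Compare one-dimensional marginals of the SRNLMC chain (`samples` from
# the sketch above) against the reference as a cheap convergence proxy.
for axis in range(3):
    w1 = wasserstein_distance(samples[:, axis], reference[:, axis])
    print(f"W1 along axis {axis}: {w1:.4f}")
```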
A summary table highlighting the principal distinctions:
Algorithm | Boundary Reflection | Reversibility | Convergence Rate | Boundary Condition for $J$
---|---|---|---|---
PLMC | Standard (Neumann) | Reversible | $\lambda_{\mathrm{RLD}}$ | none (no $J$)
SRNLMC | Skew (oblique) | Non-reversible | $\lambda_{\mathrm{SRNLD}} \ge \lambda_{\mathrm{RLD}}$ | $J(x)\,n(x) = 0$ on $\partial\mathcal{K}$
7. Theoretical Significance and Future Directions
SRNLD connects and generalizes key principles in modern sampling theory:
- It leverages the acceleration properties of non-reversible dynamics known from unconstrained settings,
- it transfers these gains to constrained domains via careful modification of both the drift and the boundary interaction,
- it is grounded in large deviations and spectral theory, providing rigorous quantification of sampling acceleration,
- it guides practical algorithm design through precise conditions on the antisymmetric matrix $J$.
Open questions include the optimal construction of $J$ for more general or non-convex constraint sets, extensions to additional constraint types, and integration into large-scale machine learning workflows.
References:
- arXiv:1103.2845 describes partially elastic reflected Langevin processes and their connection to skew reflections and non-reversibility.
- arXiv:2501.11743 and arXiv:2506.07816 provide the mathematical formulation, convergence analysis, and large deviations results for SRNLD on convex constrained domains, including practical constructions and performance benchmarks.