Elliptical Slice Sampling
- Elliptical slice sampling is a Markov chain Monte Carlo method that uses elliptical proposals generated from a Gaussian prior to sample from posteriors with arbitrary likelihoods, without any tuning parameters.
- It leverages the geometric structure of the Gaussian prior to efficiently traverse high-dimensional, correlated, and constrained posterior spaces, ensuring robust mixing and convergence.
- Extensions including adaptive, parallel, geodesic, and transport-based variants enhance its versatility for non-Gaussian and infinite-dimensional problems, with theoretical guarantees on ergodicity and spectral gaps.
Elliptical slice sampling is a Markov chain Monte Carlo (MCMC) method designed for efficient, tuning-free sampling from posterior distributions defined by Gaussian priors and arbitrary likelihoods, with applicability to both finite and infinite-dimensional problems. It exploits the geometry of the Gaussian prior to propose new states along ellipses, allowing efficient exploration of complex, correlated, and sometimes constrained posteriors. Its theoretical and algorithmic foundations have inspired a broad family of methods, including adaptive, regionally-structured, geodesic, and transformation-based generalizations.
1. Algorithmic Foundations
Elliptical slice sampling generates proposals by traversing an ellipse defined by the current state and an independent sample from the Gaussian prior $\mathcal{N}(0, \Sigma)$. For the posterior target
$$\pi(f) \propto \mathcal{N}(f; 0, \Sigma)\, L(f),$$
with Gaussian prior $\mathcal{N}(0, \Sigma)$ and arbitrary likelihood $L$, the update proceeds by:
- Drawing an auxiliary variable $\nu \sim \mathcal{N}(0, \Sigma)$ (in finite or infinite dimensions).
- Proposing candidates of the form $f' = f \cos\theta + \nu \sin\theta$ for $\theta \in [0, 2\pi)$.
- Introducing an auxiliary "slice" variable $t \sim \mathrm{Uniform}(0, L(f))$.
- Searching for an angle $\theta$ such that $L(f') > t$ via an angular shrinkage procedure.
This process requires no step-size tuning and leaves the target invariant. In infinite-dimensional separable Hilbert spaces, this construction holds provided the likelihood is lower semicontinuous, ensuring well-definedness and almost sure termination of the shrinkage loop (Hasenpflug et al., 2023).
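The update above admits a compact implementation. Below is a minimal sketch of a single elliptical slice sampling step, assuming a zero-mean Gaussian prior accessed through a sampler `prior_sample` and a log-likelihood callable `log_lik`; these names are placeholders for illustration rather than a reference implementation.

```python
import numpy as np

def elliptical_slice_step(f, log_lik, prior_sample, rng=None):
    """One elliptical slice sampling update (illustrative sketch).

    f            : current state, a 1-D NumPy array
    log_lik      : callable returning the log-likelihood log L(f)
    prior_sample : callable returning a draw nu ~ N(0, Sigma) from the Gaussian prior
    """
    rng = np.random.default_rng() if rng is None else rng

    nu = prior_sample()                          # auxiliary draw defining the ellipse
    log_t = log_lik(f) + np.log(rng.uniform())   # log slice height, t ~ Uniform(0, L(f))

    theta = rng.uniform(0.0, 2.0 * np.pi)        # initial angle on the ellipse
    theta_min, theta_max = theta - 2.0 * np.pi, theta

    while True:
        f_prop = f * np.cos(theta) + nu * np.sin(theta)   # candidate on the ellipse
        if log_lik(f_prop) > log_t:                       # candidate lies on the slice: accept
            return f_prop
        # shrink the angular bracket towards theta = 0 (the current state) and retry
        if theta < 0.0:
            theta_min = theta
        else:
            theta_max = theta
        theta = rng.uniform(theta_min, theta_max)
```

Iterating this step yields a Markov chain that leaves the posterior invariant, with no step-size or scaling parameters to tune.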
2. Theoretical Properties: Ergodicity, Reversibility, and Operator Structure
Geometric ergodicity of elliptical slice sampling has been established under mild regularity conditions on the likelihood, specifically that $L$ is bounded above and bounded below on compact sets, and that its level sets asymptotically contain Euclidean balls (Natarovskii et al., 2021). For all initial states $x$ and all $n \in \mathbb{N}$, the convergence in total variation is quantified as
$$\| P^n(x, \cdot) - \pi \|_{\mathrm{TV}} \le C \rho^n$$
for universal constants $C < \infty$ and $\rho \in (0, 1)$. Empirical results indicate that the effective sample size and mixing times of elliptical slice sampling are largely independent of dimension, outperforming random-walk Metropolis and coordinate-wise slice samplers, especially in high dimensions and for strongly correlated Gaussian processes.
The transition kernel is reversible with respect to the target distribution $\pi$; this is established via a representation of the shrinkage procedure as a reversible Markov chain on the circle for the angular variable, which lifts to the full kernel (Hasenpflug et al., 2023). The associated Markov operator is positive semidefinite, enabling spectral analysis and guaranteeing a nonnegative spectrum and convergence bounds.
Furthermore, when implemented as a hybrid slice sampler (i.e., using approximate updates on projected slices), the spectral gap is quantitatively preserved; under rapid convergence of the inner chain, the overall spectral gap remains close to that of the ideal simple slice sampler (Łatuszyński et al., 2014).
3. Algorithmic Extensions and Generalizations
Several directions of generalization and algorithmic refinement have been pursued:
- Parallel and Generalized Elliptical Slice Sampling (GESS): The GESS framework replaces the fixed Gaussian prior with a scale-location mixture (notably, a multivariate Student's $t$) and executes parallel chains whose updates share global information via fitted mixture parameters. Conditioned on the auxiliary scale, the approximation is Gaussian with mean $\mu$, and the proposal takes the form $x' = \mu + (x - \mu)\cos\theta + (\nu - \mu)\sin\theta$ with $\nu$ drawn from this approximating Gaussian. The auxiliary scale parameter is marginalized using an inverse-gamma conditional. The resulting method is rejection-free, highly parallelizable, and enables robust, rapid mixing for heavy-tailed or non-Gaussian targets (Nishihara et al., 2012).
- Regional Pseudo-Priors: RGESS partitions the parameter space into regions, each with its own fitted pseudo-prior (a mixture of Gaussian or Student's $t$ distributions), periodically updated via EM, variational inference, or stochastic approximation. Proposals employ the appropriate regional pseudo-prior and acceptance rules, ensuring ergodicity, model averaging in unimodal regimes, and better mode discovery in multimodal settings (Li et al., 2019).
- Truncated and Constrained Domains: For linearly truncated (polytope-constrained) multivariate normals, elliptical slice sampling can be adapted via an algorithm that computes the angular intervals on which the proposed ellipse remains inside the polytope. Each linear constraint is converted to an angular interval; sorting and cumulative maxima yield the active set of feasible angles, enabling efficient, numerically robust, and rejection-free sampling with a vectorizable implementation (Wu et al., 15 Jul 2024). The geometric core of this interval computation is sketched after this list.
- Probabilistic Model Checking and Trajectory Constraints: In probabilistic verification of dynamic systems under Signal Temporal Logic (STL), elliptical slice sampling efficiently draws trajectories from high-dimensional Gaussians truncated by STL requirements. Hyperplane intersections with the ellipse define feasible angle segments, and multi-level splitting strategies further enable rare-event probability estimation (Scher et al., 2022).
- Transport-based and Flow-based Methods: Transport Elliptical Slice Sampling (TESS) applies normalizing flows to "Gaussianize" the target distribution, performs elliptical slice sampling in the transformed space, and maps accepted samples back through the flow. This approach improves efficiency and mixing for strongly non-Gaussian geometries, with high computational throughput via parallel chains on modern hardware (Cabezas et al., 2022); a schematic of the latent-space likelihood construction also follows this list.
- Geodesic and Manifold Extensions: The geodesic slice sampler extends the proposal mechanism to geodesics on Riemannian manifolds, replacing straight-line or elliptical movement with numerically integrated geodesics determined by a chosen Riemannian metric. This generalization substantially improves exploration in the presence of strong curvature and multimodality, effectively reparameterizing the space so that modes or "ridges" are more accessible (Williams et al., 28 Feb 2025).
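To make the truncated-domain adaptation concrete, the sketch below computes, for each linear constraint $a_i^\top x \le b_i$, the arc of angles on the ellipse $x(\theta) = f\cos\theta + \nu\sin\theta$ on which the constraint is violated; the feasible angle set is the complement of the union of these arcs. This captures only the geometric core of the approach in (Wu et al., 15 Jul 2024), not its full sorting and cumulative-maximum bookkeeping, and the function and variable names are illustrative.

```python
import numpy as np

def violated_arcs(A, b, f, nu):
    """Arcs of theta where the ellipse f*cos(theta) + nu*sin(theta) leaves {x : A x <= b}.

    Each constraint a^T x <= b becomes r * cos(theta - phi) <= b with
    r = sqrt((a^T f)^2 + (a^T nu)^2) and phi = atan2(a^T nu, a^T f), so the
    violated angles for that constraint form a single arc (phi - alpha, phi + alpha).
    """
    p = A @ f                       # projections of the current state onto the constraint normals
    q = A @ nu                      # projections of the auxiliary draw
    r = np.hypot(p, q)
    phi = np.arctan2(q, p)

    arcs = []
    for r_i, phi_i, b_i in zip(r, phi, b):
        if b_i >= r_i:              # constraint holds at every angle of this ellipse
            continue
        if b_i <= -r_i:             # constraint fails at (essentially) every angle
            return None
        alpha = np.arccos(b_i / r_i)                 # half-width of the violated arc
        arcs.append(((phi_i - alpha) % (2 * np.pi),
                     (phi_i + alpha) % (2 * np.pi)))
    return arcs
```

Because the current state corresponds to $\theta = 0$ and already satisfies the constraints, the feasible set is nonempty, and drawing $\theta$ uniformly from it yields the rejection-free update described above.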
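For the transport-based variant, the essential construction is the "likelihood" that elliptical slice sampling sees in the flow's latent space: the pullback of the target under the flow divided by the standard-normal reference. The sketch below assembles this quantity under the assumption of a hypothetical bijective `flow` object whose `forward(z)` returns the transformed point together with the log-determinant of its Jacobian; it illustrates the idea behind (Cabezas et al., 2022) rather than reproducing the paper's exact algorithm.

```python
def tess_latent_log_lik(z, flow, log_target):
    """ESS 'log-likelihood' in latent space: log of pullback density over N(z; 0, I).

    With this likelihood and a standard-normal 'prior', elliptical slice sampling
    in z targets the pullback of the posterior; accepted latent states are mapped
    back through the flow to obtain samples in the original space.
    """
    x, log_det_jac = flow.forward(z)       # hypothetical flow API: x = T(z), log|det J_T(z)|
    log_pullback = log_target(x) + log_det_jac
    log_reference = -0.5 * float(z @ z)    # standard normal N(0, I), up to an additive constant
    return log_pullback - log_reference
```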
4. Adaptive and Gradient-Informed Variants
While standard elliptical slice sampling does not use gradient information, several related frameworks exploit local curvature and gradient data to adapt proposals:
- Covariance-Adaptive Slice Sampling: Adaptive schemes using the "crumb framework" iteratively build a Gaussian proposal distribution by drawing auxiliary crumbs and updating their covariance structure based on gradients at rejected proposals. The Gaussian is updated via a rank-one adjustment in the gradient direction to match local slice geometry, with an explicit parabolic fit to log-density available for curvature estimation. Such methods outperform non-adaptive slice samplers and Metropolis schemes in highly correlated settings, though they are most efficient in moderate dimensions (Thompson et al., 2010).
- Shrinking-Rank Adaptive Slice Sampling: The proposal subspace is shrunk orthogonally to gradient directions encountered at rejected proposals, aligning the remaining proposal geometry with the longest axis of the slice and reducing autocorrelation. The method dynamically constructs a lower-rank Gaussian approximation in an adaptively determined subspace, achieving high efficiency for strongly correlated targets (Thompson et al., 2010).
- Hamiltonian and Monomial-Gamma Connections: From a Hamilton-Jacobi perspective, elliptical slice sampling with uniform angle selection is formally equivalent to a special case of Hamiltonian Monte Carlo with a monomial kinetic energy. More general kinetic energies can in principle interpolate between HMC and slice sampling, though larger monomial exponents improve mixing at the cost of greater numerical integration error and practical tuning challenges (Zhang et al., 2016).
5. Comparative Performance, Applications, and Limitations
Elliptical slice sampling achieves dimension-independent mixing rates for Gaussian and many non-Gaussian targets, as evidenced by empirical comparisons with Metropolis-Hastings, preconditioned Crank–Nicolson, and coordinate or random-direction slice samplers (Natarovskii et al., 2021). Its robust mixing and suitability for highly correlated posteriors make it a default choice in large-scale latent Gaussian models, Gaussian process regression, and high-dimensional trajectory inference.
For constrained and truncated distributions, the method remains rejection-free and numerically stable when equipped with precise, low-complexity ellipse-polytope intersection routines (Wu et al., 15 Jul 2024). Its non-reliance on step-size or scaling parameters stands in contrast to adaptive Metropolis or Hamiltonian Monte Carlo. In strongly multimodal or highly non-Gaussian cases, regionally-structured, flow-based, or geodesic extensions provide additional robustness.
Limitations arise in cases where the prior is poorly matched to posterior geometry, or when non-Gaussian heavy tails and sharp modes outpace the mixing capacity of fixed-ellipse proposals. Augmenting ESS with adaptive, gradient, or transport-based mechanisms can mitigate these challenges at increased computational cost. For ill-conditioned or highly skewed posteriors, pre-processing transformations or pseudo-prior adaptations may be essential for efficient inference.
6. Connections with Related Methods
Elliptical slice sampling is closely related to, but distinct from, several families of MCMC techniques:
- Hybrid and Hit-and-Run Slice Sampling: By interpreting angular movements on the ellipse as uniform sampling on one-dimensional slices, ESS has connections to hit-and-run and hybrid slice samplers. Rapid mixing on each angular segment ensures the hybrid chain inherits ergodicity and the spectral gap of the ideal sampler (Łatuszyński et al., 2014).
- Gibbsian Polar and Manifold-Based Slice Samplers: Decomposing updates into radial and angular components, or traversing geodesics on a manifold, generalizes the ESS proposal to better exploit heavy tails and curved geometries (Schär et al., 2023, Williams et al., 28 Feb 2025).
- Pseudo-Prior and Quantile Slice Sampling: Pseudo-prior constructions (including regionally-adapted mixtures) and probability-integral-transform-based methods (e.g., quantile slice sampling) can be viewed as extensions that parameterize the ESS update with respect to an approximate prior or subspace decomposition, often to boost mixing and reduce rejection rates (Li et al., 2019, Heiner et al., 17 Jul 2024).
7. Implementation, Extensions, and Resources
Elliptical slice sampling and its variants have been implemented in several open-source statistical packages (R, Python) and are frequently adopted in high-dimensional Bayesian computation. New implementations emphasize parallelism, GPU acceleration, and robust constraint handling. In applied contexts, the sampler is found in Bayesian nonparametric models, spatial statistics, and probabilistic systems verification.
Recent research continues to extend the method to infinite-dimensional problems, Bayesian inverse problems, and models with intricate curvature or multimodal structure. The reversibility, positive-semidefinite Markov operator property, and explicit spectral gap guarantee provide theoretical assurance for practitioners, supporting both empirical performance and rigorous error bounds.
In summary, elliptical slice sampling occupies a central role in modern Bayesian computation, offering a theoretically sound, tuning-free, and efficient approach for sampling in complex multivariate and infinite-dimensional spaces, with a broad landscape of adaptive, transport-based, and geometry-aware extensions supporting applications from signal processing to spatiotemporal modeling.