
Curvature-Based Rejection Sampling (CURS)

Updated 29 October 2025
  • CURS is an exact Monte Carlo method that employs curvature bounds and rejection sampling to efficiently generate independent samples from probability densities defined by geodesic distances.
  • It utilizes the exponential map and volume comparison techniques to transform geometric constraints into a tractable proposal distribution for sampling on manifolds.
  • CURS offers practical benefits in moderate dimensions, particularly for applications involving symmetric positive definite matrices in statistics and signal processing.

Curvature-Based Rejection Sampling (CURS) is an exact Monte Carlo method for sampling from probability densities on Riemannian manifolds that are functions solely of geodesic distance from a base point, i.e., p(x) \propto f(d(x_0, x)). CURS leverages the interplay between classical rejection sampling and volume comparison geometry to provide efficient, theoretically exact, and independent sampling in moderate-dimensional geometric contexts, notably in spaces important for statistics and signal processing such as the manifold of symmetric positive definite matrices.

1. Mathematical Formulation and Scope

CURS targets densities on a Riemannian manifold M of the form

p(x) \propto f(r(x)), \quad r(x) = d(x_0, x)

where d(\cdot,\cdot) is the manifold’s geodesic distance. Examples include the Riemannian Gaussian (p(x) \propto \exp[-d^2(x_0, x)/2\sigma^2]), generalized Gaussian kernels, and other distance-based laws. The method is applicable to any "distance-based" probabilistic model on M, provided certain geometric and regularity conditions (see Section 4).

The geodesic exponential map \mathrm{Exp}_{x_0} allows any x \neq x_0 (away from the cut locus) to be uniquely specified via (r, s), with r > 0 and s a unit vector in T_{x_0}M: x = \mathrm{Exp}_{x_0}(r s). In these coordinates, the Riemannian volume element is

d\mathrm{vol}_M = |\det A(r,s)|\, dr\, \omega(ds)

where A(r,s) evolves according to the Jacobi equation along the geodesic, and \omega is the standard spherical measure.
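For intuition, in a space of constant curvature K every eigenvalue of A(r,s) solves the scalar Jacobi equation a'' + Ka = 0 with initial conditions a(0) = 0, a'(0) = 1; a general manifold requires the full matrix Jacobi ODE along the geodesic. A minimal numerical sketch (illustrative helper name, not from the paper) integrates this initial value problem and checks it against the closed forms \sin r for K = 1 and \sinh r for K = -1:

```python
import math

def jacobi_scalar(K, r, n_steps=10000):
    """Integrate a'' + K*a = 0 with a(0)=0, a'(0)=1 using RK4.

    For constant curvature K this gives one eigenvalue of A(r, s):
    sin(sqrt(K) r)/sqrt(K) for K > 0, r for K = 0,
    sinh(sqrt(-K) r)/sqrt(-K) for K < 0.
    """
    h = r / n_steps
    a, ap = 0.0, 1.0  # a(0) = 0, a'(0) = 1

    def f(a, ap):
        return ap, -K * a  # (a', a'')

    for _ in range(n_steps):
        k1a, k1p = f(a, ap)
        k2a, k2p = f(a + 0.5 * h * k1a, ap + 0.5 * h * k1p)
        k3a, k3p = f(a + 0.5 * h * k2a, ap + 0.5 * h * k2p)
        k4a, k4p = f(a + h * k3a, ap + h * k3p)
        a += h * (k1a + 2 * k2a + 2 * k3a + k4a) / 6
        ap += h * (k1p + 2 * k2p + 2 * k3p + k4p) / 6
    return a

# K = +1 (unit sphere): closed form is sin(r)
assert abs(jacobi_scalar(1.0, 1.2) - math.sin(1.2)) < 1e-8
# K = -1 (hyperbolic plane): closed form is sinh(r)
assert abs(jacobi_scalar(-1.0, 1.2) - math.sinh(1.2)) < 1e-8
```

For K = -\kappa^2 this solution is exactly the \kappa^{-1}\sinh(\kappa r) factor that appears in the comparison bound of Section 2.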

The ultimate goal is independent sampling from the joint law

P(dr \times ds) \propto f(r)\, |\det A(r, s)|\, dr\, \omega(ds)

where analytic or numerical evaluation of |\det A(r,s)| is often nontrivial.

2. Algorithmic Construction: Geometric Rejection Sampling

CURS resolves the inhomogeneity of |\det A(r,s)| using Bishop-Gromov-type volume comparison theorems. If M has all sectional curvatures \geq -\kappa^2, one has

|\det A(r, s)| \leq (\kappa^{-1} \sinh(\kappa r))^{d-1}

for all r, s. This yields a tractable proposal distribution over (r, s):

g(r,s) \propto f(r)\, [\kappa^{-1}\sinh(\kappa r)]^{d-1}

which factorizes into r (scalar) and s (spherically uniform) components.
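The two factors of the proposal can be sketched directly (illustrative helper names; the direction is drawn with the standard trick of normalizing a Gaussian vector, and the radial factor is left as an unnormalized density for a downstream univariate sampler):

```python
import math
import random

def uniform_direction(d, rng=random):
    """Uniform sample on the unit sphere of T_{x0}M ≅ R^d:
    normalize a standard Gaussian vector (rotation invariance)."""
    v = [rng.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(c * c for c in v))
    return [c / norm for c in v]

def radial_proposal_density(r, f, kappa, d):
    """Unnormalized radial proposal g(r) ∝ f(r) * (sinh(kappa*r)/kappa)^(d-1)."""
    return f(r) * (math.sinh(kappa * r) / kappa) ** (d - 1)

s = uniform_direction(5)
assert abs(sum(c * c for c in s) - 1.0) < 1e-9
```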

The target density is majorized by g(r,s) with known constant T = Z_\kappa/Z, where

Z = \int_{S_{x_0}M} \int_0^\infty f(r)\, |\det A(r,s)|\, dr\, \omega(ds), \quad Z_\kappa = \Omega_{d-1} \int_0^\infty f(r)\, [\kappa^{-1}\sinh(\kappa r)]^{d-1}\, dr

and sampling follows the standard rejection protocol: accept (r, s) with probability |\det A(r, s)| / (\kappa^{-1}\sinh(\kappa r))^{d-1}. The resulting x = \mathrm{Exp}_{x_0}(r s) yields an i.i.d. sample from the target law, bypassing Markov burn-in and autocorrelation.
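Putting the protocol together on the unit 2-sphere gives a fully worked, end-to-end sketch (illustrative function name, not from the paper): there |\det A(r,s)| = \sin r in closed form, and the flat \kappa \to 0 limit of the comparison bound gives \sin r \leq r, so the radial proposal is a Rayleigh density truncated to the injectivity radius (0, \pi], sampled by CDF inversion.

```python
import math
import random

def curs_sphere_gaussian(sigma, rng=random):
    """CURS sample from the Riemannian Gaussian p(x) ∝ exp(-d(x0,x)^2 / 2σ^2)
    on the unit 2-sphere, base point x0 = north pole.

    |det A(r,s)| = sin(r), and the flat comparison bound sin(r) <= r
    (the κ→0 limit of the sinh bound) gives proposal g(r) ∝ f(r)·r,
    a Rayleigh density truncated to (0, π], sampled by CDF inversion."""
    trunc = 1.0 - math.exp(-math.pi ** 2 / (2.0 * sigma ** 2))
    while True:
        # Step 1: uniform direction in the tangent plane at the north pole
        theta = rng.uniform(0.0, 2.0 * math.pi)
        # Step 2: radial draw from the truncated Rayleigh proposal (u in (0, 1])
        u = 1.0 - rng.random()
        r = math.sqrt(-2.0 * sigma ** 2 * math.log(1.0 - u * trunc))
        # Steps 3-4: accept with probability |det A(r)| / bound = sin(r)/r
        if rng.random() < math.sin(r) / r:
            # Step 5: exponential map of the sphere: (r, theta) -> point in R^3
            return (math.sin(r) * math.cos(theta),
                    math.sin(r) * math.sin(theta),
                    math.cos(r))

random.seed(0)
pts = [curs_sphere_gaussian(0.3) for _ in range(2000)]
# All samples lie on the unit sphere and concentrate near the north pole.
assert all(abs(x * x + y * y + z * z - 1.0) < 1e-12 for x, y, z in pts)
assert sum(z for _, _, z in pts) / len(pts) > 0.88
```

With \sigma = 0.3 the acceptance probability \sin(r)/r is close to 1 for most draws, so the loop rarely iterates; on higher-dimensional spheres the same bound loosens and acceptance drops, illustrating the dimension sensitivity discussed in Section 4.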

Algorithmic Steps

  • Step 1: Sample s uniformly on the unit sphere of T_{x_0}M. (Standard sphere sampling.)
  • Step 2: Sample r from g(r) \propto f(r)[\kappa^{-1}\sinh(\kappa r)]^{d-1}. (Univariate log-concave sampler if f is log-concave.)
  • Step 3: Compute |\det A(r,s)|. (Analytic in special cases; numeric for general M.)
  • Step 4: Accept (r,s) with probability |\det A(r,s)|/(\kappa^{-1}\sinh(\kappa r))^{d-1}.
  • Step 5: Output x = \mathrm{Exp}_{x_0}(r s).

The acceptance rate is exactly \Pi = Z/Z_\kappa, computable a priori in low dimensions or tractable with numerical integration.
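Because the spherical factor \Omega_{d-1} cancels, \Pi reduces to a ratio of one-dimensional integrals and can be estimated by quadrature before any sampling. A sketch for the Riemannian Gaussian on the unit 2-sphere with the flat (\kappa \to 0) comparison bound (illustrative code, not from the paper):

```python
import math

def acceptance_rate_sphere(sigma, d=2, n=100000):
    """A priori acceptance rate Π = Z / Z_κ for the Riemannian Gaussian
    on the unit d-sphere with the flat (κ→0) comparison bound:

        Z   ∝ ∫ f(r) sin(r)^(d-1) dr,   Z_0 ∝ ∫ f(r) r^(d-1) dr   on (0, π].

    The spherical factor Ω_{d-1} cancels. Simple Riemann-sum quadrature;
    both integrands vanish at r = 0 and are negligible at r = π here."""
    f = lambda r: math.exp(-r * r / (2.0 * sigma * sigma))
    h = math.pi / n
    num = den = 0.0
    for i in range(1, n):
        r = i * h
        num += f(r) * math.sin(r) ** (d - 1)
        den += f(r) * r ** (d - 1)
    return num / den

pi_rate = acceptance_rate_sphere(0.3)
assert 0.9 < pi_rate < 1.0  # bound is tight for small sigma on the 2-sphere
```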

3. Theoretical Foundation: Volume Comparison and Exactness

The correctness of CURS follows from classic facts in measure theory and differential geometry. The proposal g(r,s), scaled by the constant T, always upper-bounds the target p(r,s), and the acceptance probability is normalized to ensure unbiasedness. The volume comparison bound, central to the method, is sharp in Hadamard manifolds (complete, simply connected, \mathrm{sec} \leq 0), symmetric spaces, and matrix manifolds.

Given these properties, CURS samples are i.i.d., and the method is exact (not asymptotic), conditional on accurate volume computation. This contrasts with MCMC, which entails burn-in and mixing diagnostics and whose samples are not independent by construction.

4. Technical Conditions and Practical Regime

CURS is efficient when:

  • The manifold M admits global geodesic (exponential-map) coordinates or can be partitioned into injectivity domains;
  • The volume element |\det A(r,s)| can be computed (analytically or numerically);
  • A sectional curvature lower bound -\kappa^2 is known or available;
  • The target density f(r) supports efficient univariate rejection or inversion methods.

The method is exponentially sensitive to dimension: the expected acceptance rate \Pi may decay rapidly with \dim(M). It is therefore most effective in low to moderate dimensions (e.g., d \lesssim 10–30 for practical acceptance). For spaces with cut loci, the algorithm can be restricted per direction to remain within the injectivity radius.
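The dimensional decay of \Pi can be seen directly in the sphere example: sweeping d with the flat (\kappa \to 0) comparison bound, the ratio Z/Z_\kappa shrinks sharply as the exponent d-1 grows (illustrative computation, not from the paper):

```python
import math

def acceptance_rate(sigma, d, n=20000):
    """Π(d) = ∫ f(r) sin(r)^(d-1) dr / ∫ f(r) r^(d-1) dr on (0, π],
    for the Riemannian Gaussian f(r) = exp(-r^2 / 2σ^2) on the unit
    d-sphere with the flat (κ→0) comparison bound sin(r) <= r."""
    f = lambda r: math.exp(-r * r / (2.0 * sigma * sigma))
    h = math.pi / n
    num = den = 0.0
    for i in range(1, n):
        r = i * h
        num += f(r) * math.sin(r) ** (d - 1)
        den += f(r) * r ** (d - 1)
    return num / den

rates = [acceptance_rate(1.0, d) for d in range(2, 13)]
assert all(a > b for a, b in zip(rates, rates[1:]))  # monotone decay in d
assert rates[-1] < 0.1 * rates[0]                    # sharp drop by d = 12
```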

5. Applications to Matrix Manifolds

CURS achieves high efficiency in problems such as sampling Riemannian Gaussians or other distance-based laws on matrix manifolds (e.g., SPD matrices for covariance modeling, unitary groups for quantum information). In such contexts:

  • For sampling on the SPD manifold of N\times N real covariance matrices, CURS can compute |\det A(r,s)| explicitly as a product over eigenvalue differences and sines or hyperbolic sines (see Eqn (35) in (Maia et al., 28 Oct 2025)).
  • Example: For N=4, acceptance rates of roughly 27% are observed (see Table 2 in (Maia et al., 28 Oct 2025)), corresponding to a several-order-of-magnitude speedup over geometric MCMC methods.

A plausible implication is that, given each sample’s independence and avoidance of autocorrelation, the functional estimation quality per unit of computation at moderate d can be orders of magnitude better than in typical random-walk or geometric MCMC.

6. Relation to Other Sampling Methods

CURS generalizes classic manifold rejection sampling by combining geometric volume bounds (Jacobi fields and volume comparison) that are sound in a broad class of Riemannian and symmetric spaces. Variants of CURS (“sharp CURS”) further increase acceptance rates by refining the comparison bound, especially in matrix and group manifold settings.

CURS is not directly applicable to arbitrary, nonsymmetric densities or those dependent on structure beyond geodesic distance. For such cases, methods based on MCMC, integral geometry MCMC, or adaptation of curvature-informed transition rules (see (Sigbeku et al., 2021, Mangoubi et al., 2015)) are preferable.

7. Advantages, Limitations, and Regime of Dominance

Advantages:

  • Each sample is independent and exactly from the target distribution.
  • No burn-in; acceptance rate can be quantified prior to computation.
  • Applicable in any Hadamard or symmetric space with explicit or computable exponential map and Jacobian.

Limitations:

  • Acceptance probability decays exponentially in dimension; CURS is not preferred in high dimensions.
  • The method requires the target law to be a function of distance only.
  • Computation of |\det A(r,s)| may be numerically intensive for generic M.

A plausible implication is that in statistical learning, signal processing, or geometry-driven inference scenarios of moderate manifold dimension, CURS can serve as the method of choice for exact and efficient sampling, provided the model and geometric prerequisites are met.


Summary Table: CURS Algorithm Workflow

  • Step 1: Sample s (direction) uniformly on the tangent unit sphere at x_0.
  • Step 2: Sample r from the proposal g(r) \propto f(r)(\kappa^{-1}\sinh(\kappa r))^{d-1}. (Log-concave f(r) advisable.)
  • Step 3: Accept (r,s) with probability |\det A(r,s)|/(\kappa^{-1}\sinh(\kappa r))^{d-1}. (Volume comparison from the curvature lower bound.)
  • Step 4: Output x = \mathrm{Exp}_{x_0}(r s), an i.i.d. sample from the target.

CURS systematically incorporates differential geometric structure into rejection sampling, providing a theoretically justified and computationally favorable framework for sampling from distance-based laws on Riemannian manifolds, with substantial practical value in moderate-dimensional, geometrically structured data domains (Maia et al., 28 Oct 2025).
