
Riemannian Consistency Model (RCM)

Updated 4 October 2025
  • RCM is a generative modeling framework for Riemannian manifolds, extending consistency models beyond Euclidean spaces to respect intrinsic geometric constraints.
  • RCM employs exponential map-based parameterization and covariant derivatives to ensure on-manifold predictions and accurate vector field comparisons along curved spaces.
  • RCM demonstrates superior few-step sampling performance in experiments on spheres, tori, and SO(3), outperforming naive Euclidean methods and Riemannian Flow Matching.

The Riemannian Consistency Model (RCM) is a generative modeling paradigm designed to enable few-step generation on data defined over Riemannian manifolds, extending consistency model frameworks previously restricted to Euclidean domains. The central challenge addressed by RCM is the preservation of both intrinsic geometric constraints and the correct comparison of vector fields defined over varying tangent spaces, which are necessary for guaranteeing theoretically sound generative processes on curved spaces such as spheres, tori, and Lie groups like SO(3).

1. Geometric Motivation and Scope

RCM arises from the need to move beyond flat-domain generative processes—such as consistency models for images or Euclidean data—into intrinsically non-Euclidean settings (Cheng et al., 1 Oct 2025). Many natural data modalities, including physical measurement locations (e.g., over the Earth’s sphere), protein backbone torsions (on flat tori), or rotation data (SO(3)), fundamentally live on curved manifolds. In these domains, naively applying Euclidean updates fails to respect manifold constraints: outputs drift off-manifold, and vector comparisons (e.g., for residuals or consistency enforcement) become ill-defined due to tangent space misalignment. RCM is constructed to address these issues by:

  • Using exponential map-based parameterizations for denoiser output, ensuring manifold-valued predictions;
  • Incorporating the covariant derivative to compare vector fields intrinsically along PF-ODE sample paths;
  • Generalizing consistency-based objectives to operate with closed-form solutions in both discrete- and continuous-time regimes, facilitating few-step generative sampling.

This enables the extension of the theoretical and practical benefits of consistency models to a broad class of non-Euclidean domains.

2. Mathematical Foundations: Consistency, Covariance, and Exponential Parameterization

Exponential Map-Based Parameterization

RCM leverages the manifold exponential map for generating outputs. Specifically, the denoiser or consistency function at time $t$ is defined as

$$f_\theta(x_t, t) := \exp_{x_t}\big( \kappa_t \, v_\theta(x_t, t) \big),$$

where $x_t$ is a sample at time $t$, $v_\theta(x_t, t)$ is a learned vector field in $T_{x_t}\mathcal{M}$ (the tangent space at $x_t$), and $\kappa_t$ is a time-dependent scaling factor. This guarantees that predicted points remain on the target manifold and that generative kinematics follow geodesics.
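As a concrete illustration, here is a minimal sketch of this parameterization on the unit sphere $S^2$, assuming a hypothetical network `v_theta` that outputs ambient vectors and a schedule `kappa`; the helper names and the projection step are illustrative, not the paper's API.

```python
import torch

def sphere_exp(x, v, eps=1e-8):
    """Exponential map on the unit sphere: exp_x(v) = cos(|v|) x + sin(|v|) v/|v|."""
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.cos(norm) * x + torch.sin(norm) * (v / norm)

def project_tangent(x, u):
    """Project an ambient vector u onto T_x S^2 by removing its radial component."""
    return u - (u * x).sum(dim=-1, keepdim=True) * x

def f_theta(x_t, t, v_theta, kappa):
    """Consistency function f_theta(x_t, t) = exp_{x_t}(kappa_t * v_theta(x_t, t))."""
    v = project_tangent(x_t, v_theta(x_t, t))  # ensure the field lives in the tangent space
    return sphere_exp(x_t, kappa(t) * v)       # output is guaranteed to stay on the sphere
```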

Covariant Derivative in Consistency Loss

Because the comparison of vector fields defined at varying points on the manifold requires moving information between tangent spaces, RCM employs the covariant derivative, denoted $\nabla_{\dot{x}} v$, to measure how $v$ changes intrinsically along a trajectory $x(t)$ with velocity $\dot{x}$. This is essential for constructing time-derivative operations in curved spaces and for ensuring that the update direction respects parallel transport and local geometry.
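For an embedded manifold such as the sphere, the Levi-Civita covariant derivative is the ambient directional derivative projected back onto the tangent space. A minimal sketch under that assumption, using a Jacobian-vector product for the directional derivative (`v_field` is a hypothetical tangent vector field, with time dependence suppressed for brevity):

```python
import torch
from torch.func import jvp

def covariant_derivative(v_field, x, x_dot):
    """nabla_{x_dot} v = P_x( D v(x)[x_dot] ) on the unit sphere embedded in R^3."""
    # Ambient directional derivative of v along x_dot via a Jacobian-vector product.
    _, dv = jvp(v_field, (x,), (x_dot,))
    # Project out the normal component so the result stays in T_x S^2.
    return dv - (dv * x).sum(dim=-1, keepdim=True) * x
```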

Discrete and Continuous-Time Objectives

For discrete-time training, the RCM loss is

$$\mathcal{L}_{\text{RCM}}^N = N^2 \cdot \mathbb{E}_{t, x_t}\left[ w_t \cdot d_g^2\big( f_\theta(x_t, t),\ f_{\theta^-}(x_{t+\Delta t}, t+\Delta t) \big) \right],$$

where $d_g(\cdot, \cdot)$ is the geodesic distance on the manifold and $f_{\theta^-}$ refers to a stop-gradient (teacher) model used for stability.
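A minimal sketch of this objective on the sphere, approximating the stop-gradient teacher with `torch.no_grad()` (an EMA teacher is a common alternative); all names are illustrative:

```python
import torch

def geodesic_dist_sphere(x, y, eps=1e-7):
    """d_g(x, y) = arccos(<x, y>) on the unit sphere."""
    inner = (x * y).sum(dim=-1).clamp(-1 + eps, 1 - eps)
    return torch.acos(inner)

def rcm_discrete_loss(f, x_t, t, x_next, t_next, w_t, N):
    """L^N_RCM = N^2 * E[ w_t * d_g^2( f(x_t, t), f^-(x_next, t_next) ) ]."""
    pred_student = f(x_t, t)
    with torch.no_grad():                  # stop-gradient teacher f_{theta^-}
        pred_teacher = f(x_next, t_next)
    d2 = geodesic_dist_sphere(pred_student, pred_teacher) ** 2
    return (N ** 2) * (w_t * d2).mean()
```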

In the continuous-time limit ($N \to \infty$), the loss becomes

$$\mathcal{L}_{\text{RCM}}^\infty = \mathbb{E}_{t, x_t}\left[ w \cdot \left\| d(\exp_x)_u\big( \dot{\kappa} v + \kappa \nabla_{\dot{x}} v \big) + d(\exp_u)_x(\dot{x}) \right\|_g^2 \right],$$

with $u = \kappa v$ and $d(\exp_x)_u$ denoting the differential of the exponential map at $x$, evaluated at the tangent vector $u$. This formula correctly propagates both the movement along the manifold and the change in the vector field along the PF-ODE, enforcing consistency in both position and velocity.
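When the exponential map has a closed form, both differentials can be obtained by automatic differentiation. A minimal sketch on the sphere, an assumed implementation route rather than the paper's:

```python
import torch
from torch.func import jvp

def sphere_exp(x, v, eps=1e-8):
    """Closed-form exponential map on the unit sphere."""
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.cos(norm) * x + torch.sin(norm) * (v / norm)

def exp_differentials(x, u, du, dx):
    """Return d(exp_x)_u(du) and the base-point differential of exp applied to dx."""
    # Differential in the tangent-vector argument, evaluated at u, applied to du.
    _, d_exp_u = jvp(lambda uu: sphere_exp(x, uu), (u,), (du,))
    # Differential in the base-point argument, applied to dx (dx in T_x S^2).
    _, d_exp_x = jvp(lambda xx: sphere_exp(xx, u), (x,), (dx,))
    return d_exp_u, d_exp_x
```

Here `du` would be $\dot{\kappa} v + \kappa \nabla_{\dot{x}} v$ and `dx` would be $\dot{x}$, after which the loss is the weighted squared norm of their sum.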

3. Variants: Distillation, Training, and Theoretical Unification

RCM introduces and theoretically unifies two main variants:

  • Riemannian Consistency Distillation (RCD): The teacher model approximates the marginal vector field of the PF-ODE. The student model is trained to minimize the consistency loss relative to the teacher.
  • Riemannian Consistency Training (RCT): Utilizes the conditional vector field (conditioned on a target data point $x_1$) with an appropriate marginalization trick.

Theoretical analysis (Theorem 3.2 in (Cheng et al., 1 Oct 2025)) establishes that the objectives for RCD and RCT are equivalent under appropriate marginalization and use of the stop-gradient operation. Lemma 3.3 confirms the necessary linearity properties of the differential operators for the equivalence proof.
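In practice the two variants differ only in the training signal. A minimal sketch on the sphere, assuming a flow-matching-style conditional field $\log_{x_t}(x_1)/(1-t)$, a common choice in Riemannian Flow Matching; the paper's exact schedule may differ:

```python
import torch

def sphere_log(x, y, eps=1e-7):
    """Log map on the unit sphere: the tangent vector at x pointing toward y."""
    cos_theta = (x * y).sum(dim=-1, keepdim=True).clamp(-1 + eps, 1 - eps)
    theta = torch.acos(cos_theta)
    direction = y - cos_theta * x
    direction = direction / direction.norm(dim=-1, keepdim=True).clamp_min(eps)
    return theta * direction

def rcd_target(teacher_net, x_t, t):
    """RCD: a pretrained teacher approximates the marginal PF-ODE vector field."""
    with torch.no_grad():
        return teacher_net(x_t, t)

def rct_target(x_t, x_1, t):
    """RCT: the conditional field toward x_1; t is assumed shaped (batch, 1)."""
    return sphere_log(x_t, x_1) / (1.0 - t).clamp_min(1e-3)
```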

4. Simplified Objective and Kinematics Perspective

Recognizing the computational complexity of differentiating the exponential map, the authors develop a simplified loss:

$$\mathcal{L}_{\text{sRCM}}^\infty = \mathbb{E}_{t, x_t}\left[ w \cdot \left\| \dot{x} + \dot{\kappa} v + \kappa \nabla_{\dot{x}} v \right\|_g^2 \right].$$

This loss avoids explicit computation of $d(\exp_x)_u$ and $d(\exp_u)_x$, yet preserves the optimality of the original continuous-time objective.
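A minimal sketch of the simplified objective on the embedded sphere, where the Riemannian norm coincides with the ambient Euclidean norm; `v_field`, `kappa`, and `kappa_dot` are assumed given by the model and noise schedule (time dependence of the field suppressed for brevity):

```python
import torch
from torch.func import jvp

def srcm_loss(v_field, x, x_dot, kappa, kappa_dot, w):
    """L^inf_sRCM = E[ w * || x_dot + kappa_dot*v + kappa*nabla_{x_dot} v ||_g^2 ]."""
    v = v_field(x)
    _, dv = jvp(v_field, (x,), (x_dot,))              # ambient derivative of v along x_dot
    cov_dv = dv - (dv * x).sum(-1, keepdim=True) * x  # projection gives nabla_{x_dot} v
    residual = x_dot + kappa_dot * v + kappa * cov_dv
    return (w * residual.pow(2).sum(dim=-1)).mean()   # ||.||_g is the ambient norm on S^2
```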

From a kinematics viewpoint, the infinitesimal motion imposed by RCM at each step comprises: the instantaneous error between the predicted and true vector field, the intrinsic derivative of the field along the sample path (adjusted for curvature), and the change in tangent space orientation. The covariant derivative term $\nabla_{\dot{x}} v$ is essential for preserving geometric fidelity, as even constant vector fields are affected by manifold curvature.

5. Experimental Validation and Empirical Properties

The generative quality of RCM is evaluated through extensive experiments on:

  • Spheres: Earth-related datasets such as earthquake, volcano, and flood locations.
  • Flat tori: Synthetic checkerboard data and biomolecular torsion angles.
  • SO(3): Synthetic multimodal and Swiss roll distributions.

Performance is measured using KL divergence (for spheres) and Maximum Mean Discrepancy (for tori and SO(3)). RCM outperforms both “naive” Euclidean consistency models (which disregard the intrinsic geometry) and Riemannian Flow Matching (RFM) baselines in the few-step regime, particularly when using only two sampling steps. The approach maintains generative quality as manifold dimensionality increases, a property not shared by naive methods.
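For reference, a minimal sketch of an MMD² estimate built on a manifold geodesic distance, an assumed form of the metric (the paper's kernel and bandwidth choices are not specified here):

```python
import math
import torch

def mmd2(x, y, d_fn, bandwidth=0.5):
    """Biased MMD^2 with kernel k(a, b) = exp(-d(a, b)^2 / (2 h^2))."""
    def k(a, b):
        d = d_fn(a.unsqueeze(1), b.unsqueeze(0))  # pairwise distances, shape (n, m)
        return torch.exp(-d ** 2 / (2 * bandwidth ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

def torus_dist(a, b):
    """Geodesic distance on the flat torus: wrap each angle difference to [-pi, pi]."""
    diff = torch.remainder(a - b + math.pi, 2 * math.pi) - math.pi
    return diff.norm(dim=-1)
```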

The experiments consistently use a unified architecture, differing only by input–output dimension to accommodate the underlying manifold. The simplified loss $\mathcal{L}_{\text{sRCM}}^\infty$ yields results on par with, or better than, the full closed-form loss.

6. Theoretical Significance and Implications

By formalizing and leveraging the consistency shortcut for few-step generative sampling, RCM affords the following theoretical advances:

  • Generalization of consistency modeling from Euclidean domains to any Riemannian manifold where the exponential map and covariant derivative are tractable.
  • Theoretical guarantee that both discrete and continuous-time formulations yield the same optimal solution as the number of steps increases.
  • Intrinsic unification of distillation- and training-based consistency models via a marginalization argument and stop-gradient analysis.
  • Provision of a kinematics-based interpretation that clarifies the roles of covariant and extrinsic corrections in vector field propagation.

Such developments set the stage for consistent, geometry-aware generative modeling across a range of non-Euclidean domains.

7. Limitations and Future Directions

Computationally, the primary challenge is efficient evaluation of exponential maps, their differentials, and covariant derivatives, particularly for high-dimensional or complex manifolds. While the simplified loss circumvents some of these burdens, scaling RFM and RCM to very large real-world tasks (e.g., high-resolution protein design, robotics, high-dimensional scientific data) is a prospective area for further research. Future directions include:

  • Automating and optimizing tangent computation;
  • Investigating alternative manifold embeddings and coordinate systems;
  • Integrating RCM and RFM within more general generative frameworks for broader applicability.

Central Mathematical Formulas

| Formula | Description |
| --- | --- |
| $f_\theta(x_t, t) = \exp_{x_t}(\kappa_t \, v_\theta(x_t, t))$ | Exponential map-based consistency parameterization |
| $\mathcal{L}_{\text{RCM}}^\infty$ | Continuous-time loss with all geometric terms |
| $\mathcal{L}_{\text{sRCM}}^\infty$ | Simplified loss for computational efficiency |

Summary Table: RCM Features and Variants

| Aspect | Standard Consistency Model | Riemannian Consistency Model (RCM) |
| --- | --- | --- |
| Domain | Euclidean | Arbitrary Riemannian manifolds |
| Parameterization | Linear interpolation | Exponential map; geodesic propagation |
| Derivative | Standard derivative | Covariant derivative (parallel transport) |
| Discretization | Stepwise consistency (ODE approximation) | Stepwise geometric consistency via manifold tools |
| Training | Distillation / conditional training | RCD and RCT (theoretically unified) |
| Key Innovation | Few-step sampling shortcut via consistency loss | Intrinsic, closed-form geometric loss with a kinematic interpretation |

RCM thus provides a principled, theoretically grounded approach for geometric consistency in generative modeling, allowing for practical, scalable, and high-quality few-step synthesis on Riemannian manifolds (Cheng et al., 1 Oct 2025).

References

(1) Cheng et al., Riemannian Consistency Model, 1 Oct 2025.