
Equivariant Riemannian Stochastic Interpolation

Updated 19 December 2025
  • Equivariant Riemannian stochastic interpolation is a framework that blends Riemannian geometry, group symmetry, and stochastic processes to model data on manifolds and principal bundles.
  • It employs a twisted Polyakov action and generalized diffusion processes to preserve intrinsic metrics and ensure equivariance under group actions.
  • The approach enables effective use of equivariant graph neural networks and stochastic interpolation on Wasserstein and toroidal spaces for enhanced generative modeling and sampling fidelity.

The equivariant Riemannian stochastic interpolation framework systematically connects Riemannian geometry, group symmetry, and stochastic processes to enable rigorous probabilistic modeling and generative flow construction on manifolds and principal bundles. This family of models unifies stochastic interpolation schemes, equivariant flow matching, and message passing via diffusion, delivering significant advantages for data domains with underlying geometric and symmetry constraints—including manifold-valued features, Wasserstein spaces of measures, and periodic or symmetric particle systems. The framework is built on the preservation of intrinsic metrics, exact equivariance under group actions, and compatibility with the differential-geometric structures of principal bundles and associated vector bundles.

1. Coordinate-Independent Feature Fields and Principal Bundle Embedding

The core abstraction is the representation of numerical features as smooth, coordinate-independent feature fields on a Riemannian manifold $(M, g)$. These are realized as global sections $f \in \Gamma(E)$ of an associated vector bundle $E = P \times_{\rho} V$ constructed from the frame principal bundle $\pi: P \to M$ and a group representation $\rho: G \to \mathrm{GL}(V)$. The group $G$, typically a reductive Lie group, encodes local symmetries such as rotations or gauge transformations; local trivializations differ by transition functions $g^{BA}(x) \in G$ on chart overlaps.

Feature fields correspond to equivariant maps $h: P \to V$ satisfying $h(p \cdot g) = \rho(g^{-1})\,h(p)$ for all $p \in P$, $g \in G$. This encoding ensures that all numerical processing inherits the geometric and group-theoretic invariance properties of the underlying manifold and principal bundle (Batatia, 2023).
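As a minimal illustration (not taken from the cited work), take the fiber over a single point, so $P = SO(2)$ and $V = \mathbb{R}^2$ with the standard representation $\rho(g) = g$. The map $h(p) = p^{-1} v_0$ then satisfies the equivariance condition, which can be checked numerically:

```python
import numpy as np

def rot(theta):
    """2x2 rotation matrix, an element of SO(2)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

v0 = np.array([1.0, 2.0])  # arbitrary reference vector

def h(p):
    """Equivariant map h: SO(2) -> R^2 given by h(p) = p^{-1} v0."""
    return p.T @ v0  # the inverse of a rotation is its transpose

p, g = rot(0.7), rot(1.3)
lhs = h(p @ g)           # h(p . g)
rhs = rot(-1.3) @ h(p)   # rho(g^{-1}) h(p)
assert np.allclose(lhs, rhs)
```

The same check applies verbatim to any matrix group acting through a matrix representation.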

2. Metric Preservation via Twisted Polyakov Action

To guarantee that the embedding of geometric structure into feature space optimally preserves the Riemannian metric on the principal bundle, the framework minimizes a twisted Polyakov action. The standard Polyakov energy functional for the graph map $\varphi_h: P \to P \times V$, $\varphi_h(p) = (p, h(p))$, is

$$S_\text{Polyakov}[\varphi_h] = \int_P u^{\mu\nu}(p)\, \partial_\mu \varphi_h^i(p)\, \partial_\nu \varphi_h^j(p)\, v_{ij}\, d\mathrm{Vol}_P(p),$$

where $u$ is the $G \times M$-invariant Riemannian metric on $P$ and $v = u \oplus \kappa I_V$ is the product metric. Equivariance is enforced by incorporating a Casimir term associated to the group action, resulting in the gauged Polyakov action

$$S_\text{twist}[h] = \int_P \left[ u^{\mu\nu} \langle \partial_\mu h, \partial_\nu h \rangle_V + \frac{1}{2} \langle \mathrm{Cas} \cdot h, h \rangle_V \right] d\mathrm{Vol}_P.$$

Here, $\mathrm{Cas}$ is the quadratic Casimir operator $\sum_a d\rho(X_a)\, d\rho(X_a)$ with $\{X_a\}$ an orthonormal basis of the Lie algebra $\mathfrak{g}$ (Batatia, 2023).
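For concreteness (an illustration independent of the cited paper): for $G = SO(3)$ acting on $V = \mathbb{R}^3$ through the standard vector representation, the generators are the antisymmetric matrices below, and the Casimir sum evaluates to a multiple of the identity, as Schur's lemma requires on an irreducible representation:

```python
import numpy as np

# Antisymmetric generators of so(3) in the vector representation.
Lx = np.array([[0, 0, 0], [0, 0, -1], [0, 1, 0]], dtype=float)
Ly = np.array([[0, 0, 1], [0, 0, 0], [-1, 0, 0]], dtype=float)
Lz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)

# Cas = sum_a d(rho)(X_a) d(rho)(X_a)
cas = Lx @ Lx + Ly @ Ly + Lz @ Lz

# The Casimir acts as a scalar on this irreducible representation.
assert np.allclose(cas, -2.0 * np.eye(3))
```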

3. Equivariant Diffusion Processes and Discrete Message Passing

Taking variational derivatives yields a generalized Laplace operator on the associated bundle,

$$\partial_t h(t,p) = -(\Delta^P \otimes I_V + I_P \otimes \mathrm{Cas})\, h(t,p) \equiv \Delta^E h(t,p),$$

where $\Delta^P$ is the Laplace–Beltrami operator on $P$ and $D_\alpha$ denotes the covariant derivative induced by the bundle connection. In manifold coordinates,

$$\Delta^E h = -\left[ g^{\alpha\beta}(x)\, D_\alpha D_\beta h(x) + \mathrm{Cas} \cdot h(x) \right].$$

Time-discretizing the continuous diffusion yields message-passing schemes for graph-structured data sampled from $M$. The update for node $i$ becomes

$$h_i^{(t+1)} = U_t\big(m_i^{(t)}\big), \qquad m_i^{(t)} = e^{-t\,\mathrm{Cas}} \sum_{j \in N(i)} \int_G \rho(g^{-1})\, k_t(r_i, r_j; g)\, h_j^{(t)}\, dg,$$

for a learned kernel $k_t$ implementing metric-aware propagation and respecting $G$-equivariance (Batatia, 2023).
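A stripped-down sketch of the discretization, using the trivial representation (so $\mathrm{Cas} = 0$) and the combinatorial graph Laplacian of a toy 4-node cycle as a stand-in for $\Delta^E$; each explicit Euler step is one round of aggregate-and-update message passing:

```python
import numpy as np

# Adjacency of a 4-node cycle graph sampled from the manifold.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian (stand-in for Delta^E)
h = np.array([1.0, 0.0, 0.0, 0.0])      # initial node features

eps = 0.1
for _ in range(200):                     # explicit Euler time steps
    h = h - eps * (L @ h)                # aggregate neighbor messages, update

# Diffusion converges to the mean of the initial features.
assert np.allclose(h, 0.25 * np.ones(4), atol=1e-6)
```

With a nontrivial representation, each message would additionally be rotated by $\rho(g^{-1})$ and damped by $e^{-t\,\mathrm{Cas}}$ as in the update rule above.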

4. Stochastic Interpolation Schemes on Wasserstein and Toroidal Spaces

Stochastic interpolation is formalized on smooth Wasserstein spaces $P_\infty$ over closed Riemannian manifolds, with the diffeomorphism group $\mathscr{D}$ acting as a principal bundle over $P_\infty$ via the pushforward map $p(\varphi) = \varphi_\sharp \mathrm{vol}$. The framework constructs interpolations between endpoint measures $\mu_0, \mu_1$ by solving ODEs or SDEs on $P_\infty$:

$$d\mu_t = \sum_{i=1}^N \bar Z_i(\mu_t) \circ dW^i_t + \bar Z_0(\mu_t)\, dt,$$

with existence and uniqueness established for smooth vector fields $\bar Z_i$ (Martin, 2 Dec 2025). For amorphous particle systems with periodic boundary conditions, interpolation paths leverage the flat torus $\mathbb{T}^d$, employing geodesic maps and their time derivatives via explicit exponential/logarithm formulas:

$$I_L(t, x_0, x_1) = \exp_{x_0}\!\big(t \log_{x_0}(x_1)\big), \qquad \log_x(y) = \mathrm{wrap}(y - x).$$
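These formulas are short enough to state in code. A minimal sketch (helper names are ours) for a flat torus of box length $L$, with wrap mapping displacements to the nearest image in $[-L/2, L/2)$:

```python
import numpy as np

L = 1.0  # box length of the flat torus T^d = (R / L Z)^d

def wrap(v):
    """Nearest-image convention: map displacements into [-L/2, L/2)."""
    return (v + L / 2) % L - L / 2

def log_map(x, y):
    """log_x(y): shortest displacement from x to y on the torus."""
    return wrap(y - x)

def exp_map(x, v):
    """exp_x(v): follow the geodesic from x with initial velocity v."""
    return (x + v) % L

def interp(t, x0, x1):
    """Geodesic interpolant I_L(t, x0, x1) = exp_{x0}(t log_{x0}(x1))."""
    return exp_map(x0, t * log_map(x0, x1))

x0, x1 = np.array([0.9, 0.1]), np.array([0.1, 0.9])
assert np.allclose(interp(0.0, x0, x1), x0)
assert np.allclose(interp(1.0, x0, x1), x1)
# The midpoint crosses the periodic boundary, not the box interior.
assert np.allclose(interp(0.5, x0, x1), np.array([0.0, 0.0]))
```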

Velocity fields for ODE flows that match the interpolation law are computed as conditional expectations, yielding a mean-square loss functional:

$$\mathcal{L}(\hat v) = \int_0^1 \mathbb{E}_{X_0, X_1} \left[ \big\| \hat v(t, I_L(t, X_0, X_1)) - \log_{X_0}(X_1) \big\|^2 \right] dt.$$

This ensures simulation-free training and tractable likelihoods for generative modeling (Grenioux et al., 18 Dec 2025).
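In practice the loss is estimated by Monte Carlo over $t$ and endpoint pairs, which is what makes training simulation-free. A sketch of such an estimator on the flat torus (all helper names are ours, not from the cited paper):

```python
import numpy as np

rng = np.random.default_rng(0)
L = 1.0  # torus box length

def wrap(v):
    return (v + L / 2) % L - L / 2

def interp(t, x0, x1):
    return (x0 + t * wrap(x1 - x0)) % L

def loss(v_hat, x0s, x1s, n_t=64):
    """Monte Carlo estimate of the flow-matching objective."""
    total = 0.0
    for t in rng.uniform(0, 1, size=n_t):
        xt = interp(t, x0s, x1s)        # I_L(t, X0, X1)
        target = wrap(x1s - x0s)        # log_{X0}(X1)
        total += np.mean(np.sum((v_hat(t, xt) - target) ** 2, axis=-1))
    return total / n_t

x0s = rng.uniform(0, 1, size=(128, 2))  # batch of endpoint pairs
x1s = rng.uniform(0, 1, size=(128, 2))

zero_v = lambda t, x: np.zeros_like(x)  # trivial candidate velocity field
assert loss(zero_v, x0s, x1s) > 0.0     # a bad field incurs positive loss
```

A neural velocity field would replace `zero_v`, and the estimator is differentiated with respect to its parameters; no trajectory simulation is ever required.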

5. Enforcement of Group Equivariance and Physical Symmetry

Equivariance under all relevant group actions (permutations, global translations, and signed axis permutations) is systematically enforced in both the interpolation operator and the velocity architecture. For a symmetry group $\mathcal{G}$, exact equivariance means

$$I_L(t, g \cdot (C_0, C_1)) = g \cdot I_L(t, C_0, C_1), \qquad \hat v(t, g \cdot C) = Dg\big(\hat v(t, C)\big),$$

where $Dg$ is the pushforward associated to $g \in \mathcal{G}$. All induced densities and flows remain $\mathcal{G}$-invariant under these conditions. The optimal velocity field $v^\ast$ and the flow maps maintain symmetry throughout the interpolation (Grenioux et al., 18 Dec 2025).
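For the torus interpolant, the first identity can be verified numerically for two of the listed actions, global translations and particle permutations (a sketch with hypothetical helper names):

```python
import numpy as np

L = 1.0

def wrap(v):
    return (v + L / 2) % L - L / 2

def interp(t, x0, x1):
    return (x0 + t * wrap(x1 - x0)) % L

rng = np.random.default_rng(1)
x0, x1 = rng.uniform(0, 1, size=(2, 5, 2))  # 5 particles in 2D
c = np.array([0.3, 0.7])                    # a global translation g

for t in (0.25, 0.5, 0.75):
    lhs = interp(t, (x0 + c) % L, (x1 + c) % L)  # I_L(t, g . (C0, C1))
    rhs = (interp(t, x0, x1) + c) % L            # g . I_L(t, C0, C1)
    assert np.allclose(lhs, rhs)

# Permutations: relabeling particles commutes with interpolation.
perm = rng.permutation(5)
assert np.allclose(interp(0.5, x0[perm], x1[perm]), interp(0.5, x0, x1)[perm])
```

Translation equivariance holds because `wrap` is invariant under shifting its argument by multiples of $L$, so the log term is unchanged by a common translation of both endpoints.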

6. Equivariant Graph Neural Networks and Algorithmic Realization

The framework instantiates the required equivariance and manifold-awareness in graph neural network (GNN) architectures by integrating geometric operators directly on the manifold or torus. In the particle system setting, the equivariant GNN (EGNN) is adapted to $\mathbb{T}^d$, with each EGNN layer implementing:

  • Pairwise messaging using nearest-image torus distances,
  • Riemannian updates using the exponential/logarithm maps for positions,
  • Invariance under $\mathcal{G}$-actions for both species and spatial coordinates.

The output velocity field is assembled as zero for species features and geometric displacements for particle positions:

$$\hat v(t, C) = \left( 0_\text{species},\; \{ \log_{X_i}(x_i^K) \}_{i=1}^N \right).$$

This architecture ensures exact symmetry preservation at every layer. Automatic differentiation enables exact calculation of divergence terms for likelihood computation (Grenioux et al., 18 Dec 2025).
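The divergence entering the likelihood is the trace of the Jacobian of the velocity field; automatic differentiation computes it exactly, and a finite-difference version (a numerical check only, not the production method) makes the quantity concrete:

```python
import numpy as np

def divergence_fd(v, x, eps=1e-5):
    """Finite-difference divergence of a velocity field v at point x.
    (In practice this trace is obtained exactly by automatically
    differentiating the network; this is only a numerical check.)"""
    d = x.shape[0]
    div = 0.0
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        # Central difference of the i-th component along direction i.
        div += (v(x + e)[i] - v(x - e)[i]) / (2 * eps)
    return div

A = np.array([[1.0, 2.0], [0.5, -3.0]])
v = lambda x: A @ x                 # linear test field, so div v = tr A
x = np.array([0.2, 0.4])
assert np.isclose(divergence_fd(v, x), np.trace(A))
```

Integrating this divergence along the flow yields the log-density change under the continuous change-of-variables formula.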

7. Numerical Results, Practical Guidance, and Extensions

Empirical evaluations on glass-forming particle systems under periodic boundary conditions demonstrate that the equivariant Riemannian stochastic interpolation (eRSI) approach outperforms non-equivariant baselines and non-torus aware models both in equilibrium sample fidelity and in physical observables such as potential energy, specific heat, and radial distribution function. Key metrics include effective sample size (ESS), energy statistics, and structural indicators. eRSI yields unbiased samples with vastly improved efficiency relative to classical sampling methods and non-equivariant models (Grenioux et al., 18 Dec 2025).

Best practices include modeling periodic boundary conditions via flat-torus geometry, enforcing group equivariances by design, leveraging simulation-free objectives for tractable training, and implementing batchwise optimal transport couplings only when symmetry is preserved. The framework generalizes to higher dimensions, multi-component systems, and arbitrary periodic domains.

This suggests broad applicability to generative modeling, geometric deep learning, and probabilistic sampling in domains with nontrivial geometric and symmetry structure. A plausible implication is that further integration with molecular dynamics, sequential tempering along the interpolation path, and adaptive hybrid training will yield advances in sampling efficiency and model fidelity.


Summary Table: Essential Components of Equivariant Riemannian Stochastic Interpolation

| Component | Description | Reference |
|---|---|---|
| Feature field representation | Sections of associated vector bundles; equivariant maps on principal bundles | (Batatia, 2023) |
| Metric preservation | Twisted Polyakov action incorporating the Casimir for equivariance | (Batatia, 2023) |
| Diffusion/message passing | Generalized Laplacian and equivariant update kernels; graph discretization | (Batatia, 2023; Grenioux et al., 18 Dec 2025) |
| Group equivariance | Enforcement via architecture and objective design | (Grenioux et al., 18 Dec 2025) |
| Stochastic interpolation | ODE/SDE paths on Wasserstein/toroidal spaces; flow-matching objectives | (Martin, 2 Dec 2025; Grenioux et al., 18 Dec 2025) |
| EGNN implementation | Equivariant layers adapted to manifold geometry and group symmetries | (Grenioux et al., 18 Dec 2025) |
