
Generalised Flow Maps: Unified Transport Modeling

Updated 27 February 2026
  • Generalised Flow Maps are unified mathematical frameworks that extend classical flow maps by incorporating stochastic dynamics and geometric constraints on manifolds.
  • They employ two-parameter families and self-distillation training strategies to efficiently model, simulate, and learn transport phenomena in varied spaces.
  • Empirical benchmarks demonstrate state-of-the-art performance in applications like protein torsion synthesis and wireless channel mapping, with significantly fewer sampling steps.

Generalised Flow Maps (GFMs) provide a unified mathematical and algorithmic formalism for modeling, simulating, and learning transport phenomena and probabilistic flows in both Euclidean and non-Euclidean (Riemannian and Lie group) spaces. The concept extends classical flow maps—deterministic solutions of ODEs describing transport of mass, probability, or state—by incorporating stochasticity, geometric constraints, and few-step generative modeling tools suitable for modern applications such as geometric deep learning, generative modeling, turbulence, and wireless channel map synthesis.

1. Mathematical Foundations and Definitions

Generalised Flow Maps are two-parameter families of maps $X_{s,t}\colon M \to M$ on a manifold $(M,g)$, corresponding to the transport of points or particles from time $s$ to time $t$ under a generative probability flow. Let $\rho_0, \rho_1$ be probability densities on $M$. The map $X_{s,t}$ is uniquely defined such that, for every solution curve $x_t$ of the Riemannian probability-flow ODE

$$\partial_t x_t = v_t(x_t), \qquad \partial_t \rho_t(x) = -\mathrm{div}_g(\rho_t v_t)(x),$$

with $x_0 \sim \rho_0$ and $\rho_1$ the target, one has $X_{s,t}(x_s) = x_t$ for all $0 \le s \le t \le 1$ (Davis et al., 24 Oct 2025).

On a Riemannian manifold, the interpolating curves between endpoints are geodesics: $I_t(x_0, x_1) = \exp_{x_0}\left( \alpha_t \cdot \log_{x_0}(x_1) \right)$, with $\alpha_0 = 0$, $\alpha_1 = 1$. In Euclidean space, this reduces to a linear segment. Parametric representations use the exponential map: $X_{s,t}(x) = \exp_x\left( (t-s)\, v_{s,t}(x) \right)$, where $v_{s,t}(x)$ is a learnable vector field.
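As a concrete illustration, the geodesic interpolant $I_t$ has a closed form on the unit sphere $S^2$, where the exponential and logarithm maps are explicit. The sketch below (illustrative function names, not from the cited papers) computes $I_t(x_0, x_1)$ with NumPy:

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on S^2: follow the geodesic from x with initial tangent velocity v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def sphere_log(x, y):
    """Logarithm map on S^2: tangent vector at x pointing toward y, with length d(x, y)."""
    c = np.clip(np.dot(x, y), -1.0, 1.0)
    theta = np.arccos(c)              # geodesic distance
    if theta < 1e-12:
        return np.zeros_like(x)
    u = y - c * x                     # component of y orthogonal to x
    return theta * u / np.linalg.norm(u)

def interpolant(x0, x1, alpha_t):
    """Geodesic interpolant I_t(x0, x1) = exp_{x0}(alpha_t * log_{x0}(x1))."""
    return sphere_exp(x0, alpha_t * sphere_log(x0, x1))

x0 = np.array([1.0, 0.0, 0.0])
x1 = np.array([0.0, 1.0, 0.0])
mid = interpolant(x0, x1, 0.5)        # midpoint of the quarter great circle
```

Setting $\alpha_t = t$ recovers constant-speed geodesic interpolation; in Euclidean space the same formula degenerates to the linear segment $(1-t)x_0 + t x_1$.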

In the context of turbulence, the notion of generalised flows refers to probability measures on sets of Lagrangian trajectories, extending Arnold's deterministic Lagrangian flow to a probabilistic setting (Thalabard et al., 2018).

2. Theoretical Properties and Equivalent Characterisations

Generalised Flow Maps possess several equivalent characterisations on a manifold $(M,g)$:

  • Generalised Lagrangian: $\partial_t X_{s,t}(x) = v_t(X_{s,t}(x))$.
  • Generalised Eulerian: $\partial_s X_{s,t}(x) + d(X_{s,t})_x[v_s(x)] = 0$, replacing ambient-space gradients by manifold differentials.
  • Semigroup: $X_{u,t}(X_{s,u}(x)) = X_{s,t}(x)$, reflecting compositional consistency.
  • Tangent Condition: $\lim_{s \to t} \partial_t X_{s,t}(x) = v_t(x)$, ensuring correspondence with the instantaneous vector field on the diagonal.
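These characterisations can be checked numerically whenever the flow map is known in closed form. The sketch below (an illustrative example, not from the source) verifies the semigroup, Lagrangian, and tangent conditions for the linear ODE $\dot{x} = -x$, whose flow map is $X_{s,t}(x) = e^{-(t-s)} x$ with $v_t(x) = -x$:

```python
import numpy as np

def X(s, t, x):
    """Closed-form flow map of dx/dt = -x: X_{s,t}(x) = e^{-(t-s)} x."""
    return np.exp(-(t - s)) * x

x, s, u, t, h = 2.0, 0.1, 0.4, 0.9, 1e-6

# Semigroup: X_{u,t}(X_{s,u}(x)) == X_{s,t}(x)
semigroup_err = abs(X(u, t, X(s, u, x)) - X(s, t, x))

# Lagrangian: d/dt X_{s,t}(x) == v_t(X_{s,t}(x)), with v_t(y) = -y
dXdt = (X(s, t + h, x) - X(s, t - h, x)) / (2 * h)
lagrangian_err = abs(dXdt - (-X(s, t, x)))

# Tangent condition: as s -> t, d/dt X_{s,t}(x) == v_t(x) = -x
dXdt_diag = (X(t, t + h, x) - X(t, t - h, x)) / (2 * h)
tangent_err = abs(dXdt_diag - (-x))
```

In Euclidean space the differentials reduce to ordinary derivatives; on a manifold the same identities hold with $d(X_{s,t})_x$ in place of the Jacobian.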

These equivalences ensure that GFMs unify a broad class of transport and generative modeling frameworks, including consistency models, shortcut models, and mean flows, with the Riemannian manifold setting generalising addition to the exponential map and gradients to differentials (Davis et al., 24 Oct 2025).

3. Training Methodologies for Generative Modeling

Generalised Flow Maps are deployed as few-step generative models using self-distillation training strategies that enforce one of the GFM characterisations off-diagonal (i.e., for $s \ne t$), alongside Riemannian flow matching along the diagonal:

  • Generalised Lagrangian Self-Distillation (G-LSD): Enforces pathwise ODE consistency in time.
  • Generalised Eulerian Self-Distillation (G-ESD): Enforces stationarity with respect to the flow's starting position, yielding a Riemannian generalisation of mean flows and consistency training.
  • Generalised Progressive Self-Distillation (G-PSD): Enforces the semigroup property, recovering shortcut model analogues.

The overall loss combines a Riemannian flow-matching loss on the diagonal with the relevant self-distillation penalty. Practical training utilises batched sampling, forward passes through neural parametrisations of $v_{s,t}$, and stepwise stochastic gradient descent (Davis et al., 24 Oct 2025).
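The structure of such an objective can be sketched in a one-dimensional Euclidean toy setting, where the interpolant is linear and the flow map is parametrised as $X_{s,t}(x) = x + (t-s)\, v_{s,t}(x)$. Everything below (the linear stand-in for the neural network, the toy data with constant true velocity) is a hypothetical illustration of the loss structure, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: x1 = x0 + 1, so the true velocity field is the constant v = 1.
n = 512
x0 = rng.normal(size=n)
x1 = x0 + 1.0

def v_field(s, t, x, theta):
    # Hypothetical linear stand-in for a neural parametrisation of v_{s,t}(x).
    return theta[0] * x + theta[1]

def flow_map(s, t, x, theta):
    # Euclidean flow-map parametrisation: X_{s,t}(x) = x + (t - s) v_{s,t}(x).
    return x + (t - s) * v_field(s, t, x, theta)

def gfm_loss(theta):
    # Diagonal term: flow matching (Riemannian FM reduces to this in R^1).
    t = rng.uniform(0.0, 1.0, n)
    xt = (1.0 - t) * x0 + t * x1              # linear interpolant
    fm = np.mean((v_field(t, t, xt, theta) - (x1 - x0)) ** 2)

    # Off-diagonal term: Lagrangian-style self-distillation,
    # enforcing d/dt X_{s,t}(x_s) = v_t(X_{s,t}(x_s)).
    s = t * rng.uniform(0.0, 1.0, n)          # 0 <= s <= t
    xs = (1.0 - s) * x0 + s * x1
    h = 1e-3
    dXdt = (flow_map(s, t + h, xs, theta) - flow_map(s, t - h, xs, theta)) / (2 * h)
    sd = np.mean((dXdt - v_field(t, t, flow_map(s, t, xs, theta), theta)) ** 2)
    return fm + sd

good = gfm_loss(np.array([0.0, 1.0]))         # matches the true field
bad = gfm_loss(np.array([1.0, 0.0]))          # does not
```

The correct parameters drive both loss terms to zero, while a mismatched field incurs a positive penalty; in practice the finite difference is replaced by automatic differentiation and stop-gradient targets.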

In physics-informed inverse problems such as channel knowledge map (CKM) construction, a variant called linear transport guided flow matching (LT-GFM) models the process as a deterministic ODE along optimal transport paths between prior and target, with supervised alignment of the velocity field to the minimum transport direction (Huang et al., 6 Jan 2026).
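In the linear-transport setting, the velocity along the path $x_t = (1-t)x_0 + t x_1$ is the constant $x_1 - x_0$, which is why a handful of Euler steps suffices at inference time. The sketch below (hypothetical function names, with the exact linear-transport field standing in for a learned one) illustrates this few-step inference pattern:

```python
import numpy as np

def euler_sample(x0, v_field, n_steps=10):
    """Few-step Euler integration of dx/dt = v(t, x) from t=0 to t=1,
    mirroring few-step ODE inference (v_field is a stand-in for a learned field)."""
    x, dt = x0.copy(), 1.0 / n_steps
    for k in range(n_steps):
        x = x + dt * v_field(k * dt, x)
    return x

# Along a linear optimal-transport path the velocity is the constant x1 - x0,
# so even coarse Euler integration recovers the target exactly.
x0 = np.zeros(3)
x1 = np.array([1.0, -2.0, 0.5])
xT = euler_sample(x0, lambda t, x: x1 - x0, n_steps=10)
```

With a learned velocity field the integration is approximate, but supervision toward the minimum-transport direction keeps the trajectories nearly straight, so few steps remain accurate.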

4. Domain-Specific Extensions: Lie Groups and Turbulence

The GFM formalism generalises to non-Euclidean settings:

  • Lie Groups: On a Lie group $G$ with exponential map $\exp\colon \mathfrak{g} \to G$, the interpolation between $g_0, g_1$ is the exponential curve $\gamma(t) = g_0 \exp(t \log(g_0^{-1} g_1))$. The conditional vector field for flow matching becomes $u_t(g \mid g_1) = \log(g^{-1} g_1) / (1-t)$, naturally implemented via group and algebra operations. Neural networks parametrise vector fields by operating in coordinate charts or via group actions, and training minimises the squared-norm difference to the analytic conditional field (Sherry et al., 1 Apr 2025).
  • Turbulence: Brenier's generalised least-action principle reframes ideal fluid mechanics as a variational problem over probability measures on flow maps, yielding "generalised flows" that describe turbulent Lagrangian statistics. The GFM in this context seeks measures minimising expected action, subject to incompressibility and prescribed initial/final couplings. Monte Carlo methods on finite permutation flows enable numerical solution, capturing classical solutions for short lags and coarse-graining turbulent statistics for longer times (Thalabard et al., 2018).
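The Lie-group recipe is concrete on SO(3), where $\exp$ and $\log$ have closed forms via Rodrigues' formula. A self-contained sketch (illustrative function names) of the exponential curve $\gamma(t) = g_0 \exp(t \log(g_0^{-1} g_1))$:

```python
import numpy as np

def hat(w):
    """Map a vector in R^3 to its skew-symmetric matrix in so(3)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(W):
    """Matrix exponential so(3) -> SO(3) via the Rodrigues formula."""
    w = np.array([W[2, 1], W[0, 2], W[1, 0]])
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    return np.eye(3) + (np.sin(th) / th) * W + ((1 - np.cos(th)) / th**2) * (W @ W)

def so3_log(R):
    """Matrix logarithm SO(3) -> so(3) (principal branch, angle < pi)."""
    th = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    if th < 1e-12:
        return np.zeros((3, 3))
    return (th / (2 * np.sin(th))) * (R - R.T)

def exp_curve(g0, g1, t):
    """Interpolating curve gamma(t) = g0 exp(t log(g0^{-1} g1)) on SO(3)."""
    return g0 @ so3_exp(t * so3_log(g0.T @ g1))

g0 = np.eye(3)
g1 = so3_exp(hat(np.array([0.0, 0.0, np.pi / 2])))   # 90-degree rotation about z
gm = exp_curve(g0, g1, 0.5)                          # 45-degree rotation about z
```

Every point on the curve is an exact rotation matrix by construction, which is the geometric-consistency property the Lie-group formulation guarantees.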

5. Empirical Benchmarks and Comparative Performance

GFMs and their variants have demonstrated empirical advantages in geometric generative modeling:

  • Riemannian Manifolds: On tasks including protein torsion angle synthesis (tori), geospatial catastrophe generation (sphere), rotations (SO(3)), and hyperbolic manifold data, GFM variants achieve state-of-the-art sample quality for single- and few-step inference. For example, on 2D protein tori, a single neural function evaluation (NFE) yields MMD of 0.02–0.05, versus 0.45 for Riemannian flow matching (RFM), with up to 20× fewer steps required to match sample quality. GFM variants also achieve superior or competitive negative log-likelihoods relative to diffusion or flow-based baselines (Davis et al., 24 Oct 2025).
  • Wireless Channel Maps (CKMs): LT-GFM enables real-time, high-fidelity CKM generation with an order-of-magnitude reduction in inference steps (10 Euler steps vs. 1000 for DDPM), a 43% reduction in Fréchet Inception Distance (FID), and a factor-of-25 speedup. In both channel gain map (CGM) and spatial correlation map (SCM) synthesis, physics-informed conditioning (e.g., building masks, edge maps) and Hermitian symmetry are enforced for physical validity (Huang et al., 6 Jan 2026).
| Domain | GFM Variant | Key Metric Improvement | Reference |
|---|---|---|---|
| Protein tori | GFM (Riemannian) | 20× MMD reduction at 1 NFE | (Davis et al., 24 Oct 2025) |
| SO(3) rotations | GFM | Matched NLL, better low-step MMD | (Davis et al., 24 Oct 2025) |
| Wireless CKMs | LT-GFM | 43% FID reduction, 25× faster inference | (Huang et al., 6 Jan 2026) |

For turbulence, GFM matches classical Eulerian flows below a critical time, coarse-grains multi-scale structures, and enables explicit statistical characterization of turbulent dispersion (Thalabard et al., 2018).

6. Physical, Geometric, and Algorithmic Constraints

GFMs incorporate domain-specific constraints for enhanced physical and statistical fidelity:

  • Physics-Informed Priors: Integration of edge maps and building masks to preserve environmental discontinuities; enforcement of Hermitian symmetry in complex-valued matrices for SCM validity (Huang et al., 6 Jan 2026).
  • Geometric Consistency: Exponential-map parametrisation and differential-geometric treatment ensure consistent behavior on manifolds, Lie groups, and homogeneous spaces, with explicit extension from Euclidean flow maps (Davis et al., 24 Oct 2025, Sherry et al., 1 Apr 2025).
  • Probabilistic Coarse-Graining: In turbulence, GFMs yield probabilistic descriptions that average fine-scale fluctuations while preserving large-scale coherence, effectively bridging deterministic and stochastic modeling regimes (Thalabard et al., 2018).
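Enforcing Hermitian symmetry, for instance, amounts to projecting a generated complex matrix onto its Hermitian part, the nearest Hermitian matrix in Frobenius norm (a minimal sketch, not the cited pipeline):

```python
import numpy as np

def hermitian_project(A):
    """Project a complex matrix onto the Hermitian matrices (Frobenius-nearest),
    enforcing the symmetry a valid spatial correlation matrix must satisfy."""
    return 0.5 * (A + A.conj().T)

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = hermitian_project(A)
```

The projected matrix has a real diagonal and real eigenvalues, restoring the structure that unconstrained generation can violate.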

A key caveat: certain variational formulations (e.g., in turbulence) must be restricted to time intervals shorter than the intrinsic turnover time to remain physically relevant. Beyond this interval, non-deterministic behavior can emerge, potentially reducing physical interpretability (Thalabard et al., 2018).

7. Theoretical and Practical Unification

Generalised Flow Maps unify a range of generative modeling paradigms:

  • When instantiated with appropriate interpolants, exponential maps, and training losses, GFMs encompass consistency models, shortcut models, and mean flows as special cases in both Euclidean and Riemannian geometry.
  • They provide a framework for closed-form, simulation-free generative modeling on matrix Lie groups, enabling pose, rotation, and transformation-aware synthesis (Sherry et al., 1 Apr 2025).
  • The self-distillation-based training schemes enable few-step sampling with high sample quality, providing computational advantages over classical diffusion and flow-matching methods (Davis et al., 24 Oct 2025, Huang et al., 6 Jan 2026).

GFMs thus establish a rigorous, efficient, and extendable foundation for probabilistic transport modeling across a broad range of scientific and engineering domains, from fluid dynamics to wireless communications and geometric machine learning.
