Mean Flows: Theory and Applications
- Mean flows are space- or time-averaged components that filter out fast fluctuations to reveal coherent transport in dynamical systems.
- They are computed via analytical, numerical, and machine learning methods, utilizing differential identities and multi-scale expansions.
- Understanding mean flows informs applications across fluid mechanics, plasma physics, and generative modeling, optimizing predictions and model efficiency.
Mean flows are space- and/or time-averaged components of a dynamical system, representing coherent structures, large-scale drifts, or systematic velocities that emerge through the interplay of fast, fluctuating dynamics and nonlinear transport. In fluid mechanics, plasma physics, geophysical flows, and modern generative modeling, mean flows encapsulate principles of scale separation, nonlinear feedback, and transport optimization, and are defined, computed, and leveraged through a spectrum of analytical, numerical, and machine learning methodologies.
1. Mathematical Definitions and Mean Flow Identities
A mean flow is constructed via averaging procedures that filter out fast or small-scale fluctuations. In classical fluid mechanics, one decomposes the velocity field as $u = \bar{u} + u'$, where $\bar{u}$ is a spatial, temporal, or ensemble mean, and $u'$ are zero-mean fluctuations. In nonlinear generative modeling, the analogous object is the average velocity field
$$u(z_t, r, t) = \frac{1}{t - r} \int_r^t v(z_\tau, \tau)\, d\tau,$$
where $v$ is the instantaneous velocity along a transport path, and $u$ captures the net displacement rate over the interval $[r, t]$ (Geng et al., 19 May 2025).
A fundamental result is the MeanFlow differential identity:
$$u(z_t, r, t) = v(z_t, t) - (t - r)\,\frac{d}{dt} u(z_t, r, t), \quad \text{with} \quad \frac{d}{dt} u = v\,\partial_z u + \partial_t u.$$
This exact relationship allows mean flows to be learned or computed without explicit integration, facilitating one-step mappings in generative models and direct analysis of net transport in physical systems (Geng et al., 19 May 2025, Zhong et al., 11 Mar 2026).
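The identity above can be sanity-checked numerically on a toy one-dimensional flow. The field $v(z, t) = -z$ below is an illustrative choice (not from the cited sources) whose trajectories and average velocity are available in closed form:

```python
import numpy as np

# Toy setup (hypothetical): instantaneous velocity v(z, t) = -z, whose
# trajectories satisfy z_tau = z_t * exp(t - tau) when traced from (z_t, t).
def v(z, t):
    return -z

def u_avg(z_t, r, t):
    # Average velocity u = (z_t - z_r) / (t - r) for this analytic flow,
    # since the integral of v along the path is the net displacement.
    z_r = z_t * np.exp(t - r)
    return (z_t - z_r) / (t - r)

def total_dt(z_t, r, t, h=1e-5):
    # Total derivative d/dt u(z_t, r, t) along the trajectory:
    # z moves with the flow while t advances (r held fixed).
    z_plus = z_t + h * v(z_t, t)    # Euler step forward along the flow
    z_minus = z_t - h * v(z_t, t)   # and backward
    return (u_avg(z_plus, r, t + h) - u_avg(z_minus, r, t - h)) / (2 * h)

z_t, r, t = 1.7, 0.2, 0.9
lhs = u_avg(z_t, r, t)
rhs = v(z_t, t) - (t - r) * total_dt(z_t, r, t)
print(abs(lhs - rhs))  # small: identity holds up to finite-difference error
```

The two sides agree to finite-difference accuracy, confirming that the average velocity satisfies the differential identity without any explicit path integration.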
On Riemannian manifolds, the average (mean) velocity requires parallel transport of tangent vectors, yielding
$$u(z_t, r, t) = \frac{1}{t - r} \int_r^t \Gamma_{\tau \to t}\, v(z_\tau, \tau)\, d\tau,$$
where $\Gamma_{\tau \to t}$ denotes parallel transport along a reference curve into the tangent space at $z_t$ (Zhong et al., 11 Mar 2026, Woo et al., 8 Feb 2026). The Riemannian MeanFlow identity generalizes the Euclidean case by replacing $d/dt$ with the Levi-Civita covariant derivative.
2. Physical Mechanisms and Theoretical Context
Turbulent and Wave-driven Mean Flows
In 2D turbulence, mean flows (jets, vortices) arise from the inverse cascade of energy, whereby small-scale forcing generates large coherent structures. The statistical balance is governed by
$$\langle u' v' \rangle\, \partial_y U = \epsilon,$$
where $U$ is the mean flow, $\langle u' v' \rangle$ the Reynolds stress, and $\epsilon$ the energy injection rate (Frishman, 2017). These relations hold for isotropic and anisotropic forcings, and are robust to details of dissipation.
Wave-driven mean flows emerge from interactions such as wave–wave and wave–mean coupling. In stratified or rotating systems (e.g., planetary atmospheres, laboratory tanks), internal waves excited by turbulence or boundary forcing deposit momentum via the divergence of wave-induced Reynolds stresses, driving periodic or steady mean flows that can oscillate or reverse on timescales separated by orders of magnitude from the underlying wave periods (Couston et al., 2018).
Mean Flows in Rotating Systems
Modeling the mean zonal flows generated by weak mechanical forcings (precession, libration) in rotating spheroids, and more generally in planetary interiors, requires accounting for nonlinear Ekman-layer self-interaction. The amplitude of mean flows scales quadratically with the forcing amplitude $\varepsilon$ (i.e., as $\varepsilon^2$), and their spatial profile is sensitive to geometry, stratification, and the presence of a solid inner core (Cébron et al., 2021).
Plasma Mean Flows and Multiphysics Couplings
In magnetized plasmas, ExB mean flows are critically influenced not just by Reynolds stresses arising from velocity fluctuations but also by diamagnetic Reynolds stresses, curvature-driven cross-stresses, and polarization effects. These additional mean-stress terms emerge from ion pressure fluctuations and fast Larmor-scale physics, and become dominant under steep gradients or in the presence of strong curvature, directly impacting formation and stability of transport barriers (Madsen et al., 2016).
3. Analytical, Computational, and Modeling Frameworks
Multiple-Scale and Generalized Lagrangian Mean Expansions
Mean flow analysis in inhomogeneous or oscillatory environments is formally addressed by multi-scale expansions along mean flows. For example, in rapidly advected, oscillatory convection-diffusion problems, the effective diffusion is characterized by ergodic averages along the mean-flow trajectories, requiring the machinery of ergodic algebras and weak Σ-convergence (Holding et al., 2016).
In time-oscillatory flows (e.g., wave streaming, viscous streaming), the Generalized Lagrangian Mean (GLM) theory defines mean trajectories $\bar{x}(t)$ and mean Lagrangian velocities $\bar{u}^L$, providing a rigorous split between slow mean transport and fast oscillatory fluctuations. The mean Lagrangian velocity incorporates second-order corrections such as the Stokes drift (Provost et al., 2019).
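The Eulerian/Lagrangian distinction can be made concrete with a minimal particle-tracking sketch. The oscillatory field $u(x, t) = U\cos(kx - \omega t)$ and the parameter values below are illustrative choices: its Eulerian time-mean at a fixed point vanishes, yet a tracked particle drifts at the classical Stokes-drift rate $U^2 k / (2\omega)$:

```python
import numpy as np

# Toy oscillatory field u(x, t) = U cos(k x - w t): zero Eulerian mean,
# nonzero Lagrangian mean drift (Stokes drift) of U^2 k / (2 w).
U, k, w = 0.1, 1.0, 1.0

def vel(x, t):
    return U * np.cos(k * x - w * t)

def rk4_step(x, t, dt):
    # Classical 4th-order Runge-Kutta step for dx/dt = vel(x, t).
    k1 = vel(x, t)
    k2 = vel(x + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = vel(x + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = vel(x + dt * k3, t + dt)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

n_periods, steps_per_period = 150, 400
dt = (2 * np.pi / w) / steps_per_period
x, t = 0.0, 0.0
for _ in range(n_periods * steps_per_period):
    x = rk4_step(x, t, dt)
    t += dt

lagrangian_mean = x / t               # net drift rate of the tracked particle
stokes_drift = U**2 * k / (2 * w)     # leading-order prediction
print(lagrangian_mean, stokes_drift)
```

The numerically measured drift matches the second-order prediction to within the residual oscillation, illustrating why GLM-type means, not Eulerian means, capture net material transport.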
Mean Flows in Nonlinear Wave Modulation
The mean-flow correction is essential in high-order nonlinear Schrödinger (NLS) models, especially in finite or variable depth, where the wave-induced mean flow at the surface generalizes the Dysthe mean-flow term, enabling accurate prediction of fluid particle trajectories and group-velocity corrections in laboratory and oceanic settings (Gomel et al., 2023).
Mean-Flow Modeling in Generative Learning
In generative modeling, the mean flow framework replaces numerical ODE integration (which requires many function evaluations per sample) with direct learning and inference of average velocity fields. The trained network $u_\theta(z_t, r, t)$ enables one-step (1-NFE) sampling via
$$z_0 = z_1 - u_\theta(z_1, 0, 1),$$
substantially accelerating inference while matching or surpassing the quality of multi-step methods (Geng et al., 19 May 2025). Extensions to manifold-valued data require geometry-aware generalizations such as log-map and parallel transport operations, as implemented in Riemannian MeanFlow (RMF) (Zhong et al., 11 Mar 2026, Woo et al., 8 Feb 2026).
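A minimal sketch of one-step sampling, using a closed-form stand-in for the trained network: for a point-mass data distribution at a (hypothetical) point `x_star` under the linear path $z_t = (1 - t)x + t\varepsilon$, the exact average velocity is $u(z, r, t) = (z - x_\ast)/t$, so a single evaluation maps any prior sample to the data point:

```python
import numpy as np

# `u_exact` plays the role of the trained network u_theta(z, r, t); here it
# is the closed-form average velocity for a point-mass data distribution at
# x_star under the linear path z_t = (1 - t) x + t * eps.
x_star = np.array([0.5, -1.2])

def u_exact(z, r, t):
    return (z - x_star) / t

rng = np.random.default_rng(0)
z1 = rng.standard_normal(2)          # sample from the prior at t = 1
z0 = z1 - u_exact(z1, 0.0, 1.0)      # one evaluation replaces an ODE solve
print(z0)  # recovers x_star exactly in this degenerate example
```

Real models learn $u_\theta$ for nontrivial data distributions, but the sampling step is exactly this single subtraction, which is what makes 1-NFE inference possible.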
4. Applications and Impact Across Scientific Domains
Fluid Mechanics and Geophysics
- Turbulence: Understanding maintenance and structure of large-scale jets, vortices, and zonal flows in geophysical and astrophysical flows—quantified by the energy flux, Reynolds and eddy stresses (Frishman, 2017, Cébron et al., 2021, Currie et al., 2016).
- Internal Wave–induced Mean Flows: Mechanisms of streaming and mean Lagrangian drift with implications for tracer/pollutant transport and formation of shear layers (Couston et al., 2018, Beckebanze et al., 2018, Provost et al., 2019).
Plasma Physics
- Transport Barriers: Accurate mean-flow predictions in magnetically confined fusion devices demand modeling all contributing stresses, including those arising from microscopic pressure and velocity cross-phases, not just classical Reynolds stress (Madsen et al., 2016).
Oceanography
- Wave-Group Mean Flows: Modeling wave-averaged currents essential for interpreting laboratory flume experiments, coastal setup, infragravity phenomena, and pollutant/biological transport (Gomel et al., 2023).
Machine Learning and Generative Models
- One-Step Generative Models: MeanFlow architectures offer dramatic improvements in inference speed and sample quality for image, point cloud, voice, and molecular generation, closing the gap with multi-step flows and bypassing the need for distillation or curriculum learning (Geng et al., 19 May 2025, Akbari et al., 26 Sep 2025, Kaneko et al., 20 Feb 2026, Wang et al., 25 Sep 2025).
- Geometry-Aware Generation: RMF frameworks enable efficient and accurate one-step or few-step sampling in high-dimensional manifold domains such as protein backbones or DNA sequences (Zhong et al., 11 Mar 2026, Woo et al., 8 Feb 2026).
- Noise-injection Refinement: RMFlow methodology demonstrates that post-mean flow refinement with a targeted noise injection can enhance sample diversity and fidelity while retaining computational advantages (Huang et al., 31 Jan 2026).
5. Methodological and Computational Advances
Supervision and Training
- MeanFlow Identity-based Loss: Training the mean-flow neural operator relies on exact, differential identities that tie average and instantaneous velocities, reducing the training to a local regression problem (Geng et al., 19 May 2025, Zhong et al., 11 Mar 2026).
- Stabilization Techniques: JVP-based derivatives, conflict-aware multi-task optimization, and curriculum in interval-length improve stability and convergence, particularly for high-dimensional or manifold-geometric data (Zhong et al., 11 Mar 2026, Woo et al., 8 Feb 2026, Wang et al., 25 Sep 2025).
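The identity-based regression target can be sketched as follows. The tiny linear "network" is a stand-in for a real model, and a central finite difference emulates the JVP that autodiff frameworks would compute; the target is treated as a constant (stop-gradient) during training:

```python
import numpy as np

# Sketch of the MeanFlow identity-based regression target
# u_tgt = v - (t - r) * (v . dz u + dt u), with a toy linear u_theta.
rng = np.random.default_rng(1)
W = rng.standard_normal((2, 4)) * 0.1   # toy parameters of u_theta

def u_model(z, r, t):
    # u_theta(z, r, t): a toy linear map on the concatenated inputs.
    feats = np.concatenate([z, [r, t]])
    return W @ feats

def meanflow_target(z, r, t, v, h=1e-4):
    # Total derivative d/dt u along the flow, emulated with a central
    # finite difference in the direction (dz, dr, dt) = (v, 0, 1);
    # autodiff frameworks would use a JVP here.
    du = (u_model(z + h * v, r, t + h) - u_model(z - h * v, r, t - h)) / (2 * h)
    return v - (t - r) * du             # treated as constant (stop-gradient)

z = rng.standard_normal(2)
x, eps = rng.standard_normal(2), rng.standard_normal(2)
r, t = 0.3, 0.8
v = eps - x                              # instantaneous velocity of the path
target = meanflow_target(z, r, t, v)
loss = np.sum((u_model(z, r, t) - target) ** 2)
print(loss)
```

Note the consistency check built into the identity: at $r = t$ the target reduces to the instantaneous velocity $v$ itself, so the mean-flow model degenerates gracefully to standard flow matching on vanishing intervals.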
Theoretical Generalization
- Riemannian and Non-Euclidean Flows: Extension from Euclidean to general Riemannian manifolds via log/exponential map, parallel transport, and covariant derivatives permits intrinsic, coordinate-invariant definitions and computations (Zhong et al., 11 Mar 2026, Woo et al., 8 Feb 2026).
- Ensemble-Averaged Input–Output Models: The mean resolvent operator framework provides a statistically optimal input–output mapping for statistically steady flows, unifying LTI/Koopman perspectives and enabling rigorous linearization beyond base flows (Leclercq et al., 2022).
- Structural and Symmetry Constraints: Equivariant flows and invariance under isometries inform both PDE flows on manifolds (Castéras, 2012) and deep learning architectures for improved generalization.
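The intrinsic operations underlying these geometric extensions are concrete on the simplest non-Euclidean manifold, the unit sphere $S^2$; a minimal sketch of the exponential and logarithm maps (parallel transport omitted for brevity):

```python
import numpy as np

# Geometry primitives on the unit sphere S^2 embedded in R^3.
def exp_map(p, v):
    # Exponential map: follow the geodesic from p in direction v for |v|.
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return p
    return np.cos(norm) * p + np.sin(norm) * (v / norm)

def log_map(p, q):
    # Logarithm map: tangent vector at p pointing to q,
    # with length equal to the geodesic distance.
    cos_theta = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-12:
        return np.zeros_like(p)
    w = q - cos_theta * p
    return theta * w / np.linalg.norm(w)

p = np.array([0.0, 0.0, 1.0])             # base point (north pole)
v = np.array([0.3, -0.2, 0.0])            # tangent vector at p
q = exp_map(p, v)
print(np.linalg.norm(log_map(p, q) - v))  # ~0: log inverts exp
```

These are the coordinate-invariant building blocks with which Riemannian mean-flow methods replace the Euclidean addition and subtraction of velocities.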
6. Experimental Validation and Practical Consequences
- Quantitative Agreement with DNS and Laboratory Data: Mean-flow theory accurately reproduces the amplitude, shape, and scaling of observed zonal jets, streaming eddies, transport paths, and stress balances across multiple settings (Frishman, 2017, Beckebanze et al., 2018, Cébron et al., 2021).
- Performance in Generative Tasks: One-step MeanFlow implementations achieve FID 3.43 on ImageNet 256×256 (XL/2, 676M parameters) and outperform prior 1-NFE flow/diffusion models, achieving competitive or superior results to multi-step approaches at a fraction of computational cost (Geng et al., 19 May 2025, Akbari et al., 26 Sep 2025, Kaneko et al., 20 Feb 2026).
- Improved Matching of Laboratory Wave Flume Experiments: Incorporating correct mean-flow corrections in NLS-based models leads to accurate predictions of wave group-focusing, set-down, and tracer drift in arbitrary depth (Gomel et al., 2023).
7. Outlook and Extensions
Mean flows remain central in problems featuring multiscale interactions between fluctuations and coherent transport. Analytical advances (mean identities, covariance structure, Reynolds stress parameterizations), geometric generalizations (Riemannian flows, equivariant field flows), and data-driven implementations (neural mean-flows, reward-guided generation) continue to expand the impact and utility of mean flow theory. Prospective research directions include adaptive interval scheduling in mean-flow samplers, further exploitation of geometry in manifold-based tasks, and refined modeling of mean flows in strongly nonlinear, intermittency-dominated regimes (Geng et al., 19 May 2025, Zhong et al., 11 Mar 2026, Leclercq et al., 2022).