Physics-Conditioned Diffusion Models
- The physics-conditioned diffusion model is a generative framework that integrates explicit physical laws into the denoising process to enforce data consistency and physical plausibility.
- It employs conditioning modules, augmented loss functions, and guided sampling strategies to incorporate domain-specific physics, leading to improved fidelity and robustness.
- By combining data-driven probabilistic methods with first-principles constraints, these models achieve superior performance in inverse problems and high-dimensional scientific applications.
A physics-conditioned diffusion model is a generative framework that combines the stochastic sampling machinery of diffusion probabilistic models (DPMs) with explicit physical knowledge or constraints, thereby enabling data generation, reconstruction, or completion that is consistent with underlying first-principles physics. These models inject, embed, or impose physical laws or domain-specific operators into the architecture, loss, or sampling process of a diffusion model, facilitating generalization, improved fidelity, and interpretability in scientific, engineering, and inverse-problem domains.
1. Mathematical Principles of Physics-Conditioned Diffusion Models
Physics-conditioned diffusion models extend the standard DDPM/SDE paradigm by integrating physics information into the conditioning mechanisms of the denoising network, the loss function, or the sampling procedure. The canonical forward (noising) process takes the form

$$q(x_t \mid x_{t-1}) = \mathcal{N}\!\big(x_t;\ \sqrt{1-\beta_t}\,x_{t-1},\ \beta_t I\big), \qquad t = 1, \dots, T,$$

or, in continuous time, the SDE $\mathrm{d}x = f(x,t)\,\mathrm{d}t + g(t)\,\mathrm{d}w$, where physical constraints are introduced into the reverse process or the parametrization of the model. In physics-informed or -conditioned cases, the network is trained and/or sampled with knowledge of a physical operator (such as a PDE residual, a measurement operator, or an energy function), or with embeddings encoding first-principles information (e.g., forward operators, signal models, simulation-derived priors).
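Equivalently, the marginal $q(x_t \mid x_0) = \mathcal{N}\big(\sqrt{\bar\alpha_t}\,x_0,\ (1-\bar\alpha_t)\,I\big)$ with $\bar\alpha_t = \prod_{s \le t}(1-\beta_s)$ can be sampled in closed form. A minimal PyTorch sketch of this forward step (the schedule values and tensor shapes are illustrative assumptions):

```python
import torch

def forward_noise(x0: torch.Tensor, t: int, betas: torch.Tensor):
    """Sample x_t ~ q(x_t | x_0) under the standard DDPM forward process.

    betas: per-step variance schedule of shape [T]; t: integer step index.
    """
    alphas = 1.0 - betas
    alpha_bar = torch.cumprod(alphas, dim=0)[t]  # cumulative \bar{alpha}_t
    eps = torch.randn_like(x0)                   # injected Gaussian noise
    xt = alpha_bar.sqrt() * x0 + (1.0 - alpha_bar).sqrt() * eps
    return xt, eps

# Usage: a linear schedule over T = 1000 steps (an illustrative choice).
betas = torch.linspace(1e-4, 2e-2, 1000)
x0 = torch.randn(4, 1, 64, 64)   # e.g., a batch of 2D physical fields
xt, eps = forward_noise(x0, t=500, betas=betas)
```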
Notable implementations include:
- Domain-conditioned embeddings: Conditioning the denoising network on physical parameters, measurement geometry, or operators (e.g., MRI encoding matrices, physical boundary conditions) (Bian et al., 2023, Zeng et al., 29 Jan 2026, Zhang et al., 2024).
- Loss augmentation with physics residuals: Adding physics-based loss terms, such as $\mathcal{L}_{\text{phys}} = \big\|\mathcal{R}(\hat{x}_0)\big\|^2$ (where $\hat{x}_0$ is the reconstructed sample at a given step and $\mathcal{R}$ a physics residual operator), which can be evaluated via direct simulation, analytic gradients, or black-box metrics (Zeng et al., 29 Jan 2026, Cheng et al., 28 Jun 2025).
- Posterior score combination: Inverse problems (e.g., CT reconstruction) may form the reverse drift as the sum of the data-driven prior score and the physics-based likelihood score, derived either analytically or by differentiation through the measurement model (Li et al., 2023).
- Gradient-based or derivative-free guidance steps: Hybridizing denoising updates with explicit physics-enforcing iterations, such as gradient descent projections (Bian et al., 2023, Cheng et al., 28 Jun 2025) or black-box, evolution-based fitness shaping (Wei et al., 16 Jun 2025).
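Putting these mechanisms together, the following sketch shows one gradient-guided reverse step in the spirit of the last two bullets: the data-driven denoiser prediction is nudged by the gradient of a physics penalty evaluated at the current clean-sample estimate. The `denoiser` and `physics_residual` callables and the guidance weight are illustrative placeholders, not a specific published implementation:

```python
import torch

def guided_reverse_step(xt, t, denoiser, physics_residual, betas,
                        guidance_weight=1.0):
    """One reverse-diffusion step with gradient-based physics guidance.

    denoiser(xt, t) -> predicted noise eps_theta (a neural net);
    physics_residual(x) -> scalar penalty, e.g. a squared PDE residual.
    """
    alphas = 1.0 - betas
    alpha_bar = torch.cumprod(alphas, dim=0)

    xt = xt.detach().requires_grad_(True)
    eps = denoiser(xt, t)
    # Tweedie-style estimate of the clean sample from (xt, eps).
    x0_hat = (xt - (1 - alpha_bar[t]).sqrt() * eps) / alpha_bar[t].sqrt()
    # Differentiate the physics penalty through the x0 estimate.
    penalty = physics_residual(x0_hat)
    grad = torch.autograd.grad(penalty, xt)[0]

    # Ancestral DDPM mean, nudged against the physics-residual gradient.
    mean = (xt - betas[t] / (1 - alpha_bar[t]).sqrt() * eps) / alphas[t].sqrt()
    mean = mean - guidance_weight * grad
    noise = torch.randn_like(xt) if t > 0 else torch.zeros_like(xt)
    return (mean + betas[t].sqrt() * noise).detach()
```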
2. Physics-Conditioned Architectures and Mechanisms
Model architectures and inference pipelines supporting physics conditioning exhibit three principal strategies:
a. Explicit Conditioning Modules
Physics operators and parameters (e.g., masks, coil sensitivities, atlas priors for anatomical constraints, system-specific measurement matrices) enter the model at every layer via concatenation, Feature-wise Linear Modulation (FiLM), cross-attention, gated adapters, or external conditional embeddings (Bian et al., 2023, Zhang et al., 2024, Lyu et al., 3 Nov 2025). This ensures the denoiser’s intermediate representations remain physics-aware throughout all diffusion steps.
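As a concrete illustration of the FiLM route, a minimal PyTorch sketch that modulates a feature map with a per-channel scale and shift predicted from a vector of physical parameters (module and dimension names are hypothetical):

```python
import torch
import torch.nn as nn

class PhysicsFiLM(nn.Module):
    """Feature-wise Linear Modulation driven by physical parameters.

    phys: a vector of physics descriptors (e.g., acquisition or material
    parameters), mapped to a per-channel scale (gamma) and shift (beta).
    """
    def __init__(self, phys_dim: int, channels: int):
        super().__init__()
        self.to_scale_shift = nn.Linear(phys_dim, 2 * channels)

    def forward(self, feats: torch.Tensor, phys: torch.Tensor) -> torch.Tensor:
        gamma, beta = self.to_scale_shift(phys).chunk(2, dim=-1)
        # Broadcast over the spatial dims of a [B, C, H, W] feature map.
        gamma = gamma[:, :, None, None]
        beta = beta[:, :, None, None]
        return (1 + gamma) * feats + beta
```

Applied at every resolution level, such a module keeps the denoiser's intermediate features conditioned on the physics descriptors at all timesteps.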
b. Physics-Guided Noising/Kernels
Physical models may dictate the variance schedule and form of injected noise, matching noise propagation to physical phenomena (e.g., exponential signal decay in MRI, Gauss–Markov models for densities, GP-based spatial kernels for field-valued quantities) (Zhang et al., 2024, Long et al., 2024).
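As one illustrative possibility (a generic construction, not a schedule from the cited works), the cumulative attenuation $\sqrt{\bar\alpha_t}$ can be tied to an exponential decay law so that noising mimics physical signal loss:

```python
import torch

def decay_matched_alpha_bar(T: int, tau: float = 0.3) -> torch.Tensor:
    """Cumulative alpha-bar schedule whose signal attenuation
    sqrt(alpha_bar_t) follows exp(-t/tau) on normalized time t in (0, 1].

    tau is a hypothetical decay constant standing in for, e.g., relaxation.
    """
    t = torch.linspace(1.0 / T, 1.0, T)     # normalized diffusion time
    return torch.exp(-2.0 * t / tau)        # sqrt(alpha_bar) = exp(-t/tau)

alpha_bar = decay_matched_alpha_bar(T=1000)
# Per-step betas are recoverable as 1 - alpha_bar[t] / alpha_bar[t-1].
```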
c. Sampling and Inference Strategies
Sampling algorithms often interleave standard reverse diffusion with:
- Projections to enforce hard physics constraints (e.g., k-space data consistency, Laplacian constraints for waves).
- Plug-and-play posterior score fusion (statistically consistent Bayesian merging of data-fitting and prior terms using derivatives of measurement models).
- Test-time gradient or black-box projection steps enforcing goal satisfaction without explicit differentiability (Wei et al., 16 Jun 2025, Bian et al., 2023, Li et al., 2023).
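A minimal sketch of the first strategy, hard data consistency for an undersampled-Fourier (k-space-style) forward model; the `mask` and measured spectrum `y` are illustrative inputs:

```python
import torch

def kspace_projection(x: torch.Tensor, y: torch.Tensor, mask: torch.Tensor):
    """Replace the sampled Fourier coefficients of x with the measurements y.

    y: measured k-space (already masked); mask: 1 where measured, 0 elsewhere.
    This projects x onto the affine set {x : mask * F(x) = y}.
    """
    kx = torch.fft.fft2(x)
    kx = mask * y + (1 - mask) * kx   # keep measured entries, fill the rest
    return torch.fft.ifft2(kx).real
```

Interleaving such a projection after each denoising step keeps every intermediate sample exactly consistent with the acquired measurements.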
3. Representative Methods and Domains of Application
Physics-conditioned diffusion models have been demonstrated across a broad spectrum of inverse problems, forward modeling tasks, and data-driven surrogate emulation:
| Domain | Physical Conditioning | Reference |
|---|---|---|
| Accelerated MRI/qMRI | k-space encoding, MR signal, data consistency | (Bian et al., 2023) |
| CT Reconstruction | Nonlinear Poisson likelihood (Beer–Lambert), Bayes posterior | (Li et al., 2023) |
| dMRI Synthesis | Signal attenuation, ADC atlas, anatomical atlas prior | (Zhang et al., 2024) |
| Elastic Wave Separation | Laplacian (Helmholtz) separation, velocity fields | (Cheng et al., 28 Jun 2025) |
| Lattice Gauge Theory | Stochastic quantization SDE, Boltzmann factor scaling | (Zhu et al., 8 Feb 2025, Zhu et al., 2024) |
| Multi-physics Surrogates | GP kernel, PDE constraints, arbitrary masking | (Long et al., 2024, Zeng et al., 29 Jan 2026) |
| Tactile/Robo-Grasping | Contact geometry, mass, texture | (Lyu et al., 3 Nov 2025) |
| Air Quality Forecast | Graph diffusion ODE, Laplacian, forecast residuals | (Dong et al., 29 Jun 2025) |
| Garment Deformation | Body pose, cloth material, latent conditioning | (Dumoulin et al., 4 Apr 2025) |
| Microscopy Imaging | Point-spread function, data-fidelity gradients | (Li et al., 2023) |
Many works emphasize that such conditioning enables out-of-distribution generalization (e.g., to different coil maps, new anatomy, or altered physics parameters) and plug-and-play adaptation to new operators or measurement geometries without retraining (Bian et al., 2023, Li et al., 2023, Zhu et al., 8 Feb 2025, Zhu et al., 2024).
4. Loss Formulations and Training Objectives
Physics-conditioned models utilize joint losses combining standard noise-prediction (denoising) terms and physics-informed regularization. Prototypical objectives (see the code sketch at the end of this section) can be expressed as

$$\mathcal{L} = \mathcal{L}_{\text{denoise}} + \lambda\,\mathcal{L}_{\text{phys}},$$

where:
- $\mathcal{L}_{\text{denoise}} = \mathbb{E}_{t,\,x_0,\,\epsilon}\big[\|\epsilon - \epsilon_\theta(x_t, t)\|^2\big]$ is the standard denoising loss over noisy samples.
- $\mathcal{L}_{\text{phys}}$ penalizes violation of physical constraints, e.g., via Laplace residuals, PDE residuals (often $\ell_2$ or normalized), or algebraic inequalities (Zeng et al., 29 Jan 2026, Cheng et al., 28 Jun 2025).
- For Bayesian score-fusion sampling, the posterior drift is formed by adding prior and likelihood scores, $\nabla_x \log p(x \mid y) = \nabla_x \log p(x) + \nabla_x \log p(y \mid x)$ (Li et al., 2023).
Advanced loss designs include:
- Virtual residuals: synthetic Laplace-sampled constraint violations for stronger tail robustness (Zeng et al., 29 Jan 2026).
- Zero-regularized denoising: for functional/field-valued problems, masking and zeroing prediction on observed or conditioned components (Long et al., 2024).
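A minimal sketch of such a joint objective, using a finite-difference heat-equation residual as the physics term (the residual choice, grid layout, and weighting are assumptions for illustration):

```python
import torch

def heat_residual(u: torch.Tensor, kappa: float = 0.1) -> torch.Tensor:
    """Squared residual of u_t - kappa * u_xx on a [B, T, X] space-time grid,
    via finite differences; grid spacings are absorbed into kappa here."""
    u_t = u[:, 1:, 1:-1] - u[:, :-1, 1:-1]
    u_xx = u[:, :-1, 2:] - 2 * u[:, :-1, 1:-1] + u[:, :-1, :-2]
    return ((u_t - kappa * u_xx) ** 2).mean()

def joint_loss(eps_pred, eps_true, x0_hat, lam: float = 0.1) -> torch.Tensor:
    """L = L_denoise + lambda * L_phys, with the physics term evaluated on the
    current estimate of the clean sample x0_hat."""
    l_denoise = ((eps_pred - eps_true) ** 2).mean()
    return l_denoise + lam * heat_residual(x0_hat)
```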
5. Inference with Physics Constraints
During inference/generation, sampling processes are modified to preserve data fidelity or physical validity:
- Data Consistency Enforcement: After each denoising step, analytic or iterative corrections (e.g., projection onto measured k-space, gradient descent step w.r.t. physical residual) are interleaved, ensuring samples remain within the feasible set of the forward model (Bian et al., 2023, Cheng et al., 28 Jun 2025).
- Adaptive Posterior Score: In plug-and-play posterior sampling, the prior and likelihood gradients are summed, with the likelihood gradient coming from the physics-based measurement model (including nonlinear CT or arbitrary physical mapping, often via auto-differentiation) (Li et al., 2023).
- Evolutionary or Metropolis Refinement: In non-differentiable or black-box cases, derivative-free update rules or Metropolis-adjusted Langevin steps enforce compliance, eliminating the need for explicit gradients of potentially intractable physics (Wei et al., 16 Jun 2025, Zhu et al., 8 Feb 2025).
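For the derivative-free case, one simple scheme (an illustrative stand-in for evolution-based fitness shaping, not the exact method of the cited works) perturbs the current sample into a small population, scores each candidate with the black-box physics metric, and returns a fitness-weighted recombination:

```python
import torch

def black_box_guidance(x, fitness, pop_size=16, sigma=0.05, temp=1.0):
    """Derivative-free physics guidance via fitness-weighted recombination.

    fitness(x) -> scalar cost from a (possibly non-differentiable) physics
    check; lower is better. All hyperparameters are illustrative.
    """
    candidates = [x + sigma * torch.randn_like(x) for _ in range(pop_size)]
    costs = torch.stack([torch.as_tensor(fitness(c)) for c in candidates])
    weights = torch.softmax(-costs / temp, dim=0)   # favor low-cost samples
    return sum(w * c for w, c in zip(weights, candidates))
```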
6. Empirical Performance and Generalization
Physics-conditioned diffusion models consistently report:
- Superior accuracy and fidelity relative to both purely data-driven DPMs and earlier architectures (U-Net-based, adversarial, and score-based baselines), particularly in underdetermined or data-limited settings (Bian et al., 2023, Zeng et al., 29 Jan 2026, Zhang et al., 2024).
- Intrinsic generalization to variation in physics parameters (e.g., MRI coil maps, material properties, system geometries) and ability to accommodate domain shifts with minimal retraining (Bian et al., 2023, Zhu et al., 8 Feb 2025).
- Uncertainty quantification via sampling, providing full posterior distributions for ill-posed or ambiguous inverse problems (Li et al., 2023, Long et al., 2024).
- Sample diversity and avoidance of hallucinated artefacts, with physics-informed constraints acting as regularization against non-physical outputs (Li et al., 2023, Zhang et al., 2024).
A selection of quantitative results includes:
- Achieving PSNR/SSIM/NMSE gains at high acceleration in MRI and CT tasks (Bian et al., 2023, Li et al., 2023)
- Substantial reduction in PDE/projection residuals for flow and field problems (Zeng et al., 29 Jan 2026, Long et al., 2024, Cheng et al., 28 Jun 2025)
- Robust simulation of physical field configurations in high-dimensional lattice and multi-physics problems (Zhu et al., 8 Feb 2025, Zhu et al., 2024, Long et al., 2024)
- Strong improvements in constraint satisfaction (e.g., <1% outlier violation via Laplace-residual loss) (Zeng et al., 29 Jan 2026).
7. Outlook, Generalizations, and Limitations
Physics-conditioned diffusion modeling is an evolving paradigm with broad flexibility:
- Modularity: Any residual operator, physical constraint, or conditional information compatible with neural module conditioning can, in principle, be incorporated; in multi-functional models, arbitrary fields can be masked or zeroed as needed (Long et al., 2024, Zeng et al., 29 Jan 2026).
- Scalability: When physics operators are differentiable, gradients can be computed efficiently; for non-differentiable black-box operators, evolutionary strategies or acceptance-based sampling maintain tractability (Wei et al., 16 Jun 2025).
- Limitations: Computational cost of repeated projections or gradient steps, requirement for differentiable or numerically evaluable physics, and the need for carefully balanced loss weighting (to avoid over- or under-enforcement of constraints).
- Extensibility: Future work may integrate more sophisticated conditional encoders (e.g., CLIP-like), task-dependent physics (multi-scale, multi-domain), or adversarial/perceptual metrics. For many models, extension to higher dimensions, non-Abelian symmetry, or hierarchical multi-physics constraints is structurally straightforward (Zhu et al., 2024, Long et al., 2024).
Physics-conditioned diffusion models constitute a principled bridge between deep generative learning and first-principles scientific modeling, offering a unified framework for simulating, reconstructing, and inferring high-dimensional physical systems with strong guarantees of physical plausibility and scientific utility.