Physics-Consistent Climate Prior
- Physics-consistent climate priors are probabilistic constraints that encode conservation laws, PDE structures, and thermodynamic principles in climate models.
- The integration of objective methods like Jeffreys’ prior and physics-informed deep learning ensures robust uncertainty quantification and model calibration.
- These priors enable stochastic, dynamical, and surrogate models to deliver physically plausible predictions across reduced-order and generative frameworks.
A physics-consistent climate prior is any probabilistic or structural constraint on parameter space, model outputs, or generative learning representations that encodes the mathematical, physical, or thermodynamic laws governing the climate system. Its purpose is to ensure that probabilistic climate predictions, parameter inference, and generative surrogate models remain compatible with conservation laws, PDE system structure, statistical mechanics, or process-level physical drivers. Such priors are critical for robust uncertainty quantification, trustworthy model calibration, interpretable deep learning surrogates, and Bayesian inference, and they appear in a variety of forms: information-theoretic (e.g., Jeffreys’ prior), structural (e.g., energy-stability or PDE constraints), and data-driven (e.g., emergent-constraint regressions or physics-masked neural nets).
1. Objective and Information-Theoretic Priors: Jeffreys’ Prior for Climate Models
A foundational approach to physics-consistent Bayesian inference is the objective Jeffreys’ prior, defined solely in terms of the model likelihood and its parameter sensitivities. For a vector of model parameters $\theta$ and a likelihood $p(y \mid \theta)$, the Fisher information matrix is

$$I(\theta)_{ij} = \mathbb{E}_{y \mid \theta}\!\left[ \frac{\partial \log p(y \mid \theta)}{\partial \theta_i}\, \frac{\partial \log p(y \mid \theta)}{\partial \theta_j} \right],$$

and Jeffreys’ prior is given by

$$\pi_J(\theta) \propto \sqrt{\det I(\theta)}.$$

For climate models with multivariate normal outputs $y \sim \mathcal{N}(\mu(\theta), \Sigma)$ and constant covariance $\Sigma$, the prior simplifies to

$$\pi_J(\theta) \propto \sqrt{\det\!\left( J(\theta)^{\top} \Sigma^{-1} J(\theta) \right)},$$

where $J(\theta) = \partial \mu(\theta) / \partial \theta$ is the sensitivity (Jacobian) of the ensemble-mean response with respect to $\theta$ (Jewson et al., 2010). In the case of parameter-dependent covariance $\Sigma(\theta)$, the more general formula

$$I(\theta)_{ij} = \frac{\partial \mu}{\partial \theta_i}^{\!\top} \Sigma^{-1} \frac{\partial \mu}{\partial \theta_j} + \frac{1}{2}\, \mathrm{tr}\!\left( \Sigma^{-1} \frac{\partial \Sigma}{\partial \theta_i}\, \Sigma^{-1} \frac{\partial \Sigma}{\partial \theta_j} \right)$$

applies, maintaining invariance under reparameterization and encoding only physical properties estimated from model ensembles (Jewson et al., 2010).
By relying exclusively on properties of the model’s response, Jeffreys’ prior excludes subjective tuning, respects parameter-identifiability structure, and renders the resulting climate predictions fully objective and physics-tethered.
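To make the computation concrete, here is a minimal Python sketch for the constant-covariance case, assuming the ensemble-mean response is available as a callable; the names and the finite-difference scheme are illustrative, not the paper's implementation:

```python
import numpy as np

def jeffreys_prior_density(theta, mean_response, sigma, eps=1e-5):
    """Unnormalized Jeffreys' prior sqrt(det(J^T Sigma^{-1} J)), constant-covariance case.

    theta:         parameter vector
    mean_response: callable theta -> ensemble-mean output vector mu(theta) (assumed)
    sigma:         constant output covariance matrix (n_out x n_out)
    """
    theta = np.asarray(theta, dtype=float)
    mu0 = mean_response(theta)
    # Central finite-difference Jacobian J[i, j] = d mu_i / d theta_j
    J = np.empty((mu0.size, theta.size))
    for j in range(theta.size):
        dt = np.zeros_like(theta)
        dt[j] = eps
        J[:, j] = (mean_response(theta + dt) - mean_response(theta - dt)) / (2 * eps)
    fisher = J.T @ np.linalg.inv(sigma) @ J
    return np.sqrt(np.linalg.det(fisher))

# Toy usage: a two-parameter response with unit output covariance.
mu = lambda th: np.array([th[0] + th[1] ** 2, th[0] * th[1], th[1]])
print(jeffreys_prior_density(np.array([1.0, 0.5]), mu, np.eye(3)))
```

Because the density depends only on the model's sensitivity structure, evaluating it inside an MCMC loop requires nothing beyond ensemble-mean responses at perturbed parameter values.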
2. Physics-Consistent Priors for Stochastic and Dynamical Models
For parameter inference in stochastic climate models of the form $dX_t = f(X_t; \theta)\,dt + \sigma(X_t; \theta)\,dW_t$, physical consistency mandates that the drift and diffusion parameters enforce key structural requirements such as global energy stability. A sufficient condition is the negative-definiteness of the energy quadratic form derived from the highest-order nonlinearities (e.g., for cubic drift, the matrix $A$ assembled from the cubic coefficients must satisfy $A_s \prec 0$). This constraint is enforced by truncating the prior:

$$\pi(\theta) \propto \pi_0(\theta)\, \mathbf{1}\!\left\{ A_s(\theta) \prec 0 \right\}$$

(where $A_s = \tfrac{1}{2}(A + A^{\top})$ is the symmetric part of $A$), or equivalently by a soft penalty on positive eigenvalues. MCMC sampling on $\theta$ is then performed under this constraint, with tailored algorithms for truncated Gaussian parameter draws and imputed path sampling (Peavoy et al., 2013). This construction yields posterior ensembles over model parameters and latent trajectories that are physically admissible by construction.
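A minimal sketch of this admissibility check, assuming the cubic-drift coefficients have already been assembled into a square matrix `A` (the function name and penalty form are illustrative):

```python
import numpy as np

def energy_stability_log_prior(A, soft=False, penalty_scale=100.0):
    """Log-prior factor enforcing negative-definiteness of the symmetric part of A.

    Hard version: log of the indicator 1{A_s < 0} (0 if admissible, -inf otherwise).
    Soft version: quadratic penalty on any non-negative eigenvalues of A_s.
    """
    A_s = 0.5 * (A + A.T)                    # symmetric part of A
    eigvals = np.linalg.eigvalsh(A_s)        # real eigenvalues of the symmetric part
    if not soft:
        return 0.0 if np.all(eigvals < 0.0) else -np.inf
    violation = np.clip(eigvals, 0.0, None)  # only positive eigenvalues contribute
    return -penalty_scale * np.sum(violation ** 2)

A = np.array([[-1.0, 0.3],
              [ 0.1, -0.5]])
print(energy_stability_log_prior(A))         # 0.0 -> parameters are admissible
```

In an MCMC sampler, the hard version simply rejects inadmissible proposals, while the soft version keeps the posterior differentiable at the constraint boundary.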
3. Physics-Constrained Priors in Deep Learning Surrogates
Enforcing physics consistency in learned parameterizations or generative surrogates requires explicit algorithmic regularizers or constraining architectures:
- Neural parameterization masking: The PCMasking framework minimizes an $\ell_1$-norm penalty on the first-layer weights, enforcing an a priori sparsity that reflects the belief that only a small subset of coarse-scale features physically drives subgrid tendencies. The regularized loss is

$$\mathcal{L}(\theta) = \frac{1}{N} \sum_{n=1}^{N} \bigl\| y_n - f_\theta(x_n) \bigr\|_2^2 + \lambda \bigl\| W^{(1)} \bigr\|_1,$$

followed by hard-thresholding the first-layer weights and retraining on the masked inputs (Kühbacher et al., 2024); a sketch of this two-stage procedure follows this list. The result is a neural surrogate that ignores spurious non-causal correlations and selects only physically plausible drivers. Offline experiments demonstrate matching predictive skill (R² ≈ 0.8–0.9 in the mid-troposphere), perfect alignment of SHAP-interpretability maps with known physics, and robust generalization to perturbed climates.
- Physics-informed stochastic emulators: PACER embeds an advection–diffusion PDE (possibly with stochastic noise) as a hard prior in the neural architecture, discretizes with periodic spherical boundary conditions, and empirically initializes transport parameters from emissions fields. By propagating both the mean and the variance of climate fields, PACER maintains a Gaussian uncertainty quantification at each step. The system-level prior, written here in generic advection–diffusion form,

$$\frac{\partial c}{\partial t} = \nabla \cdot (K \nabla c) - \nabla \cdot (\mathbf{u}\, c) + S,$$

with emulated field $c$, diffusivity $K$, transport velocity $\mathbf{u}$, and source term $S$ (optionally augmented with stochastic forcing), is calibrated and applied end-to-end (Saleem et al., 2024).
- Zero-shot generative priors: ProFlow enables strict compliance with governing PDEs via a two-step proximal algorithm: projection onto the solution space of the physical PDEs and enforcement of observations, followed by transport along the pre-trained flow trajectory. The probabilistic prior is constructed via Functional Flow Matching and corrected at sampling time by proximal steps—no retraining required (Yu et al., 28 Jan 2026).
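To make the masking step in the first bullet concrete, here is a minimal PyTorch sketch of the two-stage procedure; the network size, threshold rule, and all names are illustrative assumptions rather than the reference implementation:

```python
import torch
import torch.nn as nn

class SubgridNet(nn.Module):
    """Small MLP surrogate; only the first layer is subject to the l1 sparsity penalty."""
    def __init__(self, n_in, n_hidden, n_out):
        super().__init__()
        self.first = nn.Linear(n_in, n_hidden)
        self.rest = nn.Sequential(nn.ReLU(), nn.Linear(n_hidden, n_out))

    def forward(self, x):
        return self.rest(self.first(x))

def train_step(model, x, y, optimizer, lam=1e-3):
    """One optimization step of MSE + lambda * ||W1||_1."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss = loss + lam * model.first.weight.abs().sum()  # l1 penalty on first-layer weights
    loss.backward()
    optimizer.step()
    return loss.item()

def input_mask(model, threshold=1e-2):
    """Stage 2: keep only input features whose first-layer weight column survives thresholding."""
    col_max = model.first.weight.abs().max(dim=0).values  # strongest weight per input feature
    return col_max > threshold                            # boolean mask over inputs

# After l1 training, retrain a fresh model on x[:, input_mask(model)] only.
```

The hard threshold converts a soft sparsity preference into an explicit structural prior: features that the penalized network learned to ignore are removed from the retrained surrogate entirely.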
4. Emergent Constraints as Physics-Consistent Statistical Priors
Emergent constraints exploit inter-model physical relationships (such as those implied by the fluctuation–dissipation theorem) to build statistical priors for unknown climate sensitivities. If ensemble models exhibit a tight empirical regression between an observable $X$ (e.g., present-day variability) and a future response $Y$ (e.g., ECS or TCR) that is validated by theory, the relationship

$$Y = aX + b + \epsilon$$

is inverted via the observed value $X_{\mathrm{obs}}$ to obtain an informative prior on $Y$:

$$\pi(Y) = \int p(Y \mid X)\, p(X \mid X_{\mathrm{obs}})\, dX.$$

For example, using CMIP5 models together with observations of global-mean temperature variability, the plausible range of ECS (in °C) can be substantially narrowed (Williamson et al., 2020). The critical requirement is a robust theoretical underpinning (e.g., linear response, radiative-interaction physics) to avoid spurious correlations.
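A minimal numerical sketch of this inversion, using synthetic ensemble data and a Gaussian observational estimate (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "multi-model ensemble": observable X and response Y with a tight regression.
X = rng.normal(1.0, 0.3, size=30)
Y = 2.0 * X + 1.0 + rng.normal(0.0, 0.1, size=30)

# Inter-model regression Y = a X + b and its residual spread.
a, b = np.polyfit(X, Y, deg=1)
resid_sd = np.std(Y - (a * X + b), ddof=2)

# Gaussian observational estimate of X, inverted by Monte Carlo into a prior on Y.
x_obs, x_obs_sd = 1.1, 0.05
x_samples = rng.normal(x_obs, x_obs_sd, size=100_000)
y_samples = a * x_samples + b + rng.normal(0.0, resid_sd, size=x_samples.size)

lo, hi = np.percentile(y_samples, [17, 83])  # ~66% credible interval
print(f"constrained Y: mean={y_samples.mean():.2f}, 66% interval=({lo:.2f}, {hi:.2f})")
```

The resulting distribution over Y is exactly the "informative prior" described above and can be fed into a subsequent Bayesian analysis.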
5. Physics-Based Priors in Conceptual and Reduced-Order Models
Reduced-order energy-balance models, such as those based on vertical radiative–convective equilibrium, provide analytic, physically closed priors. The planetary radiation balance

$$(1 - A)\,\frac{S_0}{4} = \sigma T_e^4,$$

with the emission temperature $T_e$ tied to the surface temperature $T_s$ through the lapse rate $\Gamma$ and the greenhouse-gas optical depth,

$$T_s = T_e + \Gamma z_e,$$

and an emission altitude $z_e$ that rises with increasing GHG concentration, produces a deterministic mapping from GHG concentration to OLR and tropospheric overturning (Benestad, 2013). These models encode physical entanglements (e.g., between radiative resistance and convective compensation) and can be embedded as priors in Bayesian hierarchical models, with validation against reanalysis trends (e.g., the observed upward trend of the emission altitude, measured in metres per decade).
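A minimal sketch of this deterministic mapping, using textbook constants and an assumed logarithmic dependence of the emission altitude on CO2 (the functional form and all coefficients here are illustrative, not values from the paper):

```python
import numpy as np

SIGMA = 5.670e-8  # Stefan-Boltzmann constant [W m^-2 K^-4]
S0 = 1361.0       # solar constant [W m^-2]
ALBEDO = 0.30     # planetary albedo
LAPSE = 6.5e-3    # tropospheric lapse rate [K m^-1]

def emission_temperature():
    """Planetary radiation balance: (1 - A) * S0 / 4 = sigma * T_e^4."""
    return ((1.0 - ALBEDO) * S0 / (4.0 * SIGMA)) ** 0.25

def surface_temperature(co2_ppm, z0=5000.0, dz_per_doubling=150.0, co2_ref=280.0):
    """T_s = T_e + lapse * z_e, with an emission altitude assumed to rise log2 in CO2."""
    z_e = z0 + dz_per_doubling * np.log2(co2_ppm / co2_ref)
    return emission_temperature() + LAPSE * z_e

print(f"T_e           = {emission_temperature():.1f} K")
print(f"T_s (280 ppm) = {surface_temperature(280.0):.1f} K")
print(f"T_s (560 ppm) = {surface_temperature(560.0):.1f} K")
```

Because the mapping from GHG concentration to surface temperature is analytic, it can serve directly as a mean function or constraint inside a Bayesian hierarchical model.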
6. Physics Guidance in Generative Models for Remote Sensing or Downscaling
Physics-consistent priors can be embedded in generative vision models by conditioning on segmentation maps derived from physics-based simulators (e.g., SLOSH for flood extent). For satellite image translation tasks such as flood and reforestation visualization, conditioning on segmentation maps obtained from shallow-water PDE solves enforces that synthesized scenes are consistent with hydrodynamic constraints. The “physics-conditioned” pix2pixHD GAN uses standard adversarial, feature-matching, and perceptual losses; the crucial prior is introduced by concatenating the physics-guided mask to the generator input. Empirical comparisons show that such physics-guided conditioning increases both intersection-over-union (IoU) and composite plausibility metrics relative to unconstrained deep learning baselines (IoU = 0.502 versus 0.226 for the GAN without physics conditioning) (Lütjens et al., 2021).
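The conditioning mechanism itself is simple to express. Here is a minimal PyTorch sketch of concatenating a physics-derived flood mask as an extra generator input channel (tensor shapes and names are illustrative):

```python
import torch

def physics_conditioned_input(rgb, flood_mask):
    """Concatenate a physics-derived segmentation mask as an extra input channel.

    rgb:        (batch, 3, H, W) pre-event satellite image
    flood_mask: (batch, 1, H, W) binary flood extent from a hydrodynamic solver
    """
    return torch.cat([rgb, flood_mask], dim=1)  # (batch, 4, H, W) generator input

rgb = torch.rand(2, 3, 256, 256)
mask = (torch.rand(2, 1, 256, 256) > 0.5).float()
print(physics_conditioned_input(rgb, mask).shape)  # torch.Size([2, 4, 256, 256])
```

The prior enters purely through the data pathway: the generator is free to choose textures, but the physical extent of the flood is fixed by the PDE solve.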
7. Limitations, Assumptions, and Best Practices
Physics-consistent climate priors, though structurally rigorous, rely on the correct specification of the governing equations, the validity of statistical assumptions (e.g., normality, invariance, or stationarity), and the identification of meaningful physical drivers. For Jeffreys’ prior, core limitations include the assumption of multivariate normality and constant or tractable parameterizations of the covariance. In deep learning frameworks, a plausible implication is that sparsity or hard masking may miss weak but relevant long-range interactions. For emergent constraints, the candidate relationship must be validated by both ensemble universality and physical theory, with care taken to avoid overfitting or spurious statistical associations (Jewson et al., 2010, Kühbacher et al., 2024, Williamson et al., 2020, Saleem et al., 2024, Yu et al., 28 Jan 2026). It remains essential to balance minimal informativeness with scientific rigor and to embed physical knowledge at all stages of the computational and statistical workflow.