
Physics-Constrained Gaussian Process Regression

Updated 17 January 2026
  • Physics-constrained Gaussian Process Regression integrates physical knowledge via tailored kernels, constraints, and augmented likelihoods to enforce laws like conservation and boundary conditions.
  • It reduces data requirements and improves extrapolation reliability by embedding physical structures directly into the regression framework.
  • This approach is applied in surrogate modeling, inverse problems, and state estimation, offering enhanced interpretability and controlled uncertainty.

Physics-constrained Gaussian process regression (GPR) refers to a family of modeling frameworks that incorporate physical laws, analytical constraints, or mechanistic knowledge directly into the prior, kernel, likelihood, or learning process of GPR. This methodology improves data efficiency and robustness to extrapolation, and guarantees that predictions adhere to fundamental physical principles such as conservation laws, boundary conditions, monotonicity, or integral constraints.

1. Fundamental Principles and Motivation

Standard GPR is a nonparametric Bayesian regression technique that models latent functions as a Gaussian process with a mean function and a covariance kernel. While extremely flexible, unconstrained GPR relies purely on data and typically assumes stationarity, leading to poor performance in data-scarce or high-dimensional settings, or when the underlying system is governed by strong physical laws. Physics-constrained GPR embeds these known physical structures, reducing the data required for accurate prediction and providing posterior distributions consistent with governing equations or inequalities (Chang et al., 2022, Ma et al., 2020, Cross et al., 2023).

Incorporating physics can be achieved by:

  • Designing kernels consistent with Green’s functions or spectral properties of PDEs.
  • Imposing constraints on the GP prior or mean (e.g., boundary conditions).
  • Augmenting the likelihood or the evidence with penalties or soft-constraints derived from physical residuals.
  • Constraining optimization or posterior inference to satisfy monotonicity, normalization, or nonnegativity.

2. Classes of Physical Constraints and Formalisms

Physics constraints in GPR frameworks manifest at multiple levels:

a) Hard Constraints and Boundary Enforcement

The GP prior is formulated so that all realizations satisfy boundary conditions or operator constraints. For example, using spectral expansions in the eigenfunctions of the domain's Laplacian or of a linear PDE operator, the covariance is constructed as

$$k(x, x') = \sum_{n=1}^{M} S\!\left(\sqrt{\lambda_n}\right) \phi_n(x)\,\phi_n(x')$$

where $(\phi_n, \lambda_n)$ solve the BVP with prescribed boundary conditions (Gulian et al., 2020, Solin et al., 2019, Seyedheydari et al., 4 Jul 2025). This ensures that both the posterior mean and all sample paths satisfy the BCs by construction.
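As a minimal sketch of this construction (assuming a 1-D domain $[0,1]$ with homogeneous Dirichlet conditions and a squared-exponential spectral density, both illustrative choices not fixed by the text), the truncated expansion above can be assembled directly:

```python
import numpy as np

# Reduced-rank kernel from Laplacian eigenfunctions on [0, 1] with
# homogeneous Dirichlet BCs: phi_n(x) = sqrt(2) sin(n pi x), lambda_n = (n pi)^2.
# Every sample path vanishes at x = 0 and x = 1 by construction. The spectral
# density S is that of a squared-exponential prior (illustrative choice).
def dirichlet_spectral_kernel(x, xp, M=30, lengthscale=0.2):
    n = np.arange(1, M + 1)
    lam = (n * np.pi) ** 2                          # Laplacian eigenvalues on [0, 1]
    phi = lambda z: np.sqrt(2.0) * np.sin(np.outer(z, n) * np.pi)  # eigenfunctions
    # SE spectral density evaluated at sqrt(lambda_n)
    S = np.sqrt(2 * np.pi) * lengthscale * np.exp(-0.5 * lengthscale**2 * lam)
    return phi(x) @ np.diag(S) @ phi(xp).T

x = np.linspace(0, 1, 101)
K = dirichlet_spectral_kernel(x, x)
# Prior variance is ~0 on the boundary, so all realizations satisfy the BCs.
print(K[0, 0], K[-1, -1])
```

Because the boundary behavior is baked into the basis, no extra constraint handling is needed at inference time.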

b) Physics-informed Likelihoods and Evidence

Physical knowledge is encoded with a Boltzmann–Gibbs factor in the evidence,

$$P_\text{physics}(y \mid X) \propto \exp\!\left[-\beta\, L_\text{physics}(y, X)\right]$$

where $L_\text{physics}$ quantifies violations of residuals, boundary misfit, or variational energy, and $\beta$ tunes the strength of the constraint. The marginal evidence becomes

$$p(y \mid X) \propto \left[\int p(f \mid X, \theta)\, p_\text{data}(y \mid f)\, df\right] P_\text{physics}(y \mid X)$$

which regularizes the data-fit by enforcing proximity to physical laws (Chang et al., 2022).
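A toy illustration of this penalized evidence (the ODE $u'' = -u$, the finite-difference residual, and all parameter values here are assumptions made for the sketch, not the cited construction):

```python
import numpy as np

# Sketch: GP log marginal likelihood augmented with a Boltzmann-Gibbs penalty
# -beta * L_physics, where L_physics is the mean squared residual of the toy
# ODE u''(x) = -u(x), evaluated on the posterior mean by finite differences.
def rbf(a, b, ell=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def penalized_log_evidence(X, y, Xc, beta=10.0, noise=1e-2, ell=0.5):
    K = rbf(X, X, ell) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    log_ml = (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
              - 0.5 * len(X) * np.log(2 * np.pi))      # standard GP evidence
    mu = rbf(Xc, X, ell) @ alpha                        # posterior mean on grid
    h = Xc[1] - Xc[0]
    resid = (mu[2:] - 2 * mu[1:-1] + mu[:-2]) / h**2 + mu[1:-1]  # u'' + u
    return log_ml - beta * np.mean(resid**2)            # penalized evidence

X = np.linspace(0, np.pi, 8)
y = np.sin(X)                    # data consistent with the physics u'' = -u
Xc = np.linspace(0, np.pi, 50)   # collocation grid for the physics residual
print(penalized_log_evidence(X, y, Xc))
```

Since the penalty is nonnegative, the penalized evidence is never larger than the unpenalized one; $\beta$ controls how strongly hyperparameter selection is pulled toward physically consistent fits.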

c) Prior Statistics from Stochastic Models

Empirical mean and (nonstationary) covariance are computed from Monte Carlo realizations of stochastic PDEs or SDEs, so that predicted mean fields automatically satisfy physical linear operators:

$$\mu_\text{MC}(x) = \frac{1}{M}\sum_{m=1}^{M} u^{(m)}(x), \qquad k_\text{MC}(x, x') = \frac{1}{M-1}\sum_{m=1}^{M} \left(u^{(m)}(x) - \mu_\text{MC}(x)\right)\left(u^{(m)}(x') - \mu_\text{MC}(x')\right)$$

with the guarantee that the Kriging posterior mean preserves the constraints (e.g., $\mathcal{L}[\hat{y}] = g$) up to simulator error (Yang et al., 2018, Ma et al., 2020).
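The mechanism can be sketched with a toy stochastic model (an illustrative stand-in for an SPDE simulator) whose every realization satisfies the linear constraint $u(0) = 0$; the Kriging posterior mean inherits the constraint because it is the empirical mean plus a linear combination of empirical covariance columns:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 60)

# Toy stochastic model: every sample satisfies u(0) = 0 (standing in for the
# operator constraint L[u] = g in the text; the model itself is illustrative).
M = 500
xi = rng.normal(size=(M, 2))
U = xi[:, :1] * np.sin(np.pi * x) + xi[:, 1:] * x * (1 - x)  # rows: realizations

mu = U.mean(axis=0)
K = np.cov(U, rowvar=False)          # nonstationary empirical covariance

# Kriging on a few observations; the posterior mean is mu plus a linear
# combination of covariance columns, so it inherits the constraint u(0) = 0.
idx = [15, 30, 45]
Kobs = K[np.ix_(idx, idx)] + 1e-6 * np.eye(len(idx))
y = np.sin(np.pi * x[idx])           # synthetic observations
post_mean = mu + K[:, idx] @ np.linalg.solve(Kobs, y - mu[idx])
print(post_mean[0])                  # ~0: constraint preserved
```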

d) Derivative-based and Inequality Constraints

Monotonicity, convexity, nonnegativity, or normalization can be enforced via transformation of the GP prior over functionals, e.g., including derivatives in the joint GP and imposing pointwise constraints on those derivatives or integrals. For monotonic GPs, a probit likelihood on the first derivatives is employed at a set of inducing points:

$$p(m_i \mid f'_i) = \Phi(f'_i / \nu)$$

with expectation propagation used for posterior inference (Tran et al., 2022, Pensoneault et al., 2020, Li et al., 6 Jan 2026, Seyedheydari et al., 4 Jul 2025).
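The ingredients of this construction can be sketched as follows (the RBF kernel, its derivative cross-covariances, and the parameter values are illustrative assumptions; the expectation-propagation inference step of the cited work is omitted):

```python
import numpy as np
from math import erf

# Joint GP covariance over function values f(x) and derivatives f'(s) at
# inducing points, using the closed-form RBF derivative kernels, plus the
# probit factor p(m | f') that encodes monotonicity.
def rbf(a, b, ell):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def joint_cov(x, s, ell=0.5):
    Kff = rbf(x, x, ell)
    d = x[:, None] - s[None, :]
    Kfd = (d / ell**2) * rbf(x, s, ell)                    # Cov(f(x), f'(s))
    dd = s[:, None] - s[None, :]
    Kdd = (1.0 / ell**2 - dd**2 / ell**4) * rbf(s, s, ell)  # Cov(f'(s), f'(s'))
    return np.block([[Kff, Kfd], [Kfd.T, Kdd]])

def probit(z, nu=1e-6):
    return 0.5 * (1.0 + erf(z / (nu * np.sqrt(2.0))))       # p(m = 1 | f'(s))

x = np.linspace(0, 1, 5)   # data locations
s = np.linspace(0, 1, 3)   # derivative-inducing points
C = joint_cov(x, s)
print(C.shape, probit(0.3))  # a positive derivative gets probability ~1
```

In the full method these probit factors replace Gaussian likelihood terms at the inducing points, which is why approximate inference (EP) is needed.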

3. Representative Methodologies

Deep Kernel Physics-Constrained GPR

A hybrid approach encodes physical laws via a Boltzmann–Gibbs term and parametrizes the covariance as a deep kernel:

$$k(x, x'; \theta) = k_\text{SE}\!\left(\phi(x; \theta),\, \phi(x'; \theta)\right)$$

where $\phi(\cdot\,; \theta)$ is a neural-network feature map trained jointly with the physics-constrained loss. Minibatched training with “stochastic inducing points” keeps computation tractable, enabling uncertainty propagation in high dimensions with minimal labeled data (Chang et al., 2022).
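A minimal sketch of such a deep kernel, with a small fixed tanh network standing in for the trained feature map (the architecture and all parameters are assumptions for illustration; in the cited work $\theta$ is fitted jointly with the physics-constrained loss):

```python
import numpy as np

# Deep kernel: a squared-exponential kernel applied to a feature map phi(x).
rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 8)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 3)), rng.normal(size=3)

def phi(X):
    # Tiny two-layer tanh network mapping R^2 -> R^3 (illustrative weights).
    return np.tanh(np.tanh(X @ W1 + b1) @ W2 + b2)

def deep_kernel(Xa, Xb, ell=1.0):
    Fa, Fb = phi(Xa), phi(Xb)
    sq = ((Fa[:, None, :] - Fb[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / ell**2)   # k_SE in feature space

X = rng.normal(size=(10, 2))
K = deep_kernel(X, X)
print(np.allclose(K, K.T), np.allclose(np.diag(K), 1.0))  # symmetric, unit diagonal
```

Because $k_\text{SE}$ is a valid kernel for any feature map, the composite remains a valid covariance while the learned $\phi$ absorbs nonstationarity.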

Joint Posterior with Constraints

The generalized joint prior includes both function values and their transforms:

$$\mathrm{Cov}\begin{pmatrix} f(x) \\ [L f](x') \end{pmatrix} = \begin{pmatrix} k(x, x') & L_{x'} k(x, x') \\ L_x k(x, x') & L_x L_{x'} k(x, x') \end{pmatrix}$$

Conditioning on observations of $f$ and $L f$ yields a posterior that preserves operator constraints globally (Gulian et al., 2020, Cross et al., 2023).
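For a concrete toy instance, take $L = d/dx$ with an RBF kernel, where the cross-covariance blocks have the standard closed forms, and condition on noise-free observations of both $f$ and $Lf$ (the data $f = \sin$, $Lf = \cos$ and all parameters are illustrative):

```python
import numpy as np

# Joint prior over f and Lf for L = d/dx under an RBF kernel, then GP
# conditioning on mixed observations of f and Lf.
ell = 1.0
k   = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
kL  = lambda a, b: ((a[:, None] - b[None, :]) / ell**2) * k(a, b)            # Cov(f(a), f'(b))
kLL = lambda a, b: (1/ell**2 - (a[:, None] - b[None, :])**2 / ell**4) * k(a, b)  # Cov(f'(a), f'(b))

xf = np.array([0.0, 2.0]); yf = np.sin(xf)      # observations of f
xd = np.array([1.0]);      yd = np.cos(xd)      # observations of Lf = f'
xs = np.linspace(0.0, 2.0, 21)                  # prediction grid

Kcross = kL(xf, xd)                             # Cov(f(xf), Lf(xd))
Kjoint = np.block([[k(xf, xf), Kcross],
                   [Kcross.T,  kLL(xd, xd)]]) + 1e-10 * np.eye(3)
Kstar = Np = np.hstack([k(xs, xf), kL(xs, xd)]) # Cov(f(xs), joint observations)
mu = Kstar @ np.linalg.solve(Kjoint, np.concatenate([yf, yd]))
print(mu[0], mu[-1])   # reproduces the f observations at x = 0 and x = 2
```

The posterior mean interpolates the $f$ data exactly and bends to match the derivative information at $x = 1$, which is precisely the operator-consistency the joint prior buys.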

Regularization by Lagrange Multipliers or Pseudo-Observations

Enforcing normalization, conservation, or integral constraints can be cast via Lagrange multiplier optimization in a kernel eigenbasis, or by augmenting the observation vector with “pseudo-measurements” for the constraint:

$$\begin{pmatrix} \mu \\ 1 \end{pmatrix} = \begin{pmatrix} \mathcal{A} \\ \mathcal{B} \end{pmatrix} \rho + \begin{pmatrix} \varepsilon \\ 0 \end{pmatrix}$$

leading to joint posteriors that satisfy the constraint exactly (Seyedheydari et al., 4 Jul 2025).
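A sketch of the pseudo-observation route for a normalization constraint ($\int_0^1 f\,dx = 1$, discretized by trapezoidal quadrature; the grid, kernel, and data values are illustrative, not the paper's exact setup):

```python
import numpy as np

# Enforce the integral constraint by appending a (nearly) noise-free
# pseudo-observation row w with w @ f = 1, where w holds trapezoid weights.
x = np.linspace(0, 1, 50)
ell = 0.2
K = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ell**2)   # RBF prior covariance

w = np.full(len(x), 1.0 / (len(x) - 1))
w[0] *= 0.5; w[-1] *= 0.5                                  # trapezoid quadrature

idx = [10, 25, 40]
A = np.zeros((len(idx) + 1, len(x)))
A[np.arange(len(idx)), idx] = 1.0                          # ordinary point observations
A[-1] = w                                                  # integral pseudo-observation
y = np.concatenate([[1.2, 0.8, 1.1], [1.0]])               # last entry: integral = 1

noise = np.diag(np.r_[1e-4 * np.ones(len(idx)), 1e-12])    # ~zero noise on constraint
S = A @ K @ A.T + noise
mu = K @ A.T @ np.linalg.solve(S, y)                       # posterior mean on grid
print(w @ mu)   # ~1.0: the constraint holds to numerical precision
```

Driving the pseudo-observation noise to zero recovers the exactly constrained posterior of the Lagrange-multiplier formulation.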

Hybrid Multifidelity GPs

Low-fidelity predictions from physics-based simulators are combined with high-fidelity data via autoregressive co-kriging:

$$Y_H(x) = \rho\, Y_L(x) + Y_d(x)$$

with $Y_L$ physics-informed or built from Monte Carlo simulation and $Y_d$ a GP modeling the discrepancy, so that the aggregated model leverages physical structure while correcting for model bias (Yang et al., 2018).
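A toy version of this autoregressive structure (the simulator, the "truth", and the least-squares estimate of $\rho$ are all illustrative assumptions made for the sketch):

```python
import numpy as np

# Autoregressive co-kriging Y_H(x) = rho * Y_L(x) + Y_d(x): the low-fidelity
# simulator is treated as known everywhere, rho is fit by least squares, and
# the discrepancy Y_d is a zero-mean GP trained on scarce high-fidelity data.
f_low  = lambda x: np.sin(2 * np.pi * x)                   # physics-based simulator
f_high = lambda x: 1.5 * np.sin(2 * np.pi * x) + 0.2 * x   # truth: scale + bias

Xh = np.linspace(0, 1, 6); yh = f_high(Xh)                 # scarce high-fidelity data
rho = (f_low(Xh) @ yh) / (f_low(Xh) @ f_low(Xh))           # least-squares scale factor

ell = 0.3
k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
d = yh - rho * f_low(Xh)                                   # discrepancy observations
alpha = np.linalg.solve(k(Xh, Xh) + 1e-6 * np.eye(len(Xh)), d)

xs = np.linspace(0, 1, 101)
pred = rho * f_low(xs) + k(xs, Xh) @ alpha                 # co-kriging mean
print(np.max(np.abs(pred - f_high(xs))))                   # small on this toy
```

Only the discrepancy GP has hyperparameters to fit, since the low-fidelity model carries the physical structure, which mirrors the efficiency claim in Section 5.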

4. Quantitative Impact, Data Efficiency, and Limitations

Physics-constrained GPR demonstrates marked advantages:

  • Order-of-magnitude reduction in required labeled data for accurate uncertainty quantification in PDE surrogates (from $O(10^4)$ to $O(10^2)$ samples) (Chang et al., 2022).
  • Preservation of essential physical constraints even under extrapolation, e.g., non-decreasing strength with pressure in surrogate concrete constitutive models, with NRMSE reduced from 11.6% (unconstrained) to 6.7% (constrained), and R² increasing from 0.49 to 0.88 (Li et al., 6 Jan 2026).
  • Propagation of non-stationary, physically meaningful covariances through the forecast horizon in power systems and multivariate coupled oscillatory systems (Ma et al., 2020, Cross et al., 2023).
  • Posterior variances and uncertainty bands that contract dramatically in physics-constrained regions or under functional constraints (e.g., monotonicity or normalization) (Tran et al., 2022, Seyedheydari et al., 4 Jul 2025).
  • In ill-posed inverse problems (e.g., inverse light scattering), constrained GPR prevents physically implausible solutions, guaranteeing property enforcement such as normalization or boundedness (Seyedheydari et al., 4 Jul 2025).

However, key limitations and trade-offs emerge:

  • Setting the scalar $\beta$, or the relative weighting between data fit and physical loss, requires tuning for an optimal bias-variance trade-off (Chang et al., 2022).
  • In non-Gaussian or non-linear constraint settings, the posterior may only be available in approximate form (e.g., via expectation propagation or variational inference) (Tran et al., 2022).
  • Physical constraints may restrict extrapolation flexibility if not properly constructed.
  • Constructing eigenfunction bases on complex domains or with uncertain operators may require significant computation (Gulian et al., 2020, Solin et al., 2019).

5. Computational Complexity and Practical Algorithms

Typical unconstrained GPR scales as $O(N^3)$ with $N$ training points due to kernel matrix inversion. Physics-constrained approaches introduce several computational optimizations:

  • Spectral truncation to $m \ll N$ eigenmodes yields complexity $O(Nm^2 + m^3)$, achieving efficient reduced-rank inference (Solin et al., 2019, Gulian et al., 2020, Seyedheydari et al., 4 Jul 2025).
  • Stochastic inducing-point strategies allow scaling to minibatches without loss of constraint enforcement in deep kernel hybrid GPs (Chang et al., 2022).
  • For monotonicity and derivative constraints, a moderate number $M$ (10–20) of derivative-inducing points keeps expectation propagation tractable at $O((N+M)^3)$ in small-to-moderate $N$ settings (Tran et al., 2022).
  • In multifidelity frameworks, only a small number of discrepancy-kernel hyperparameters require optimization, since the physics-informed prior is fixed (Yang et al., 2018).

6. Applications, Case Studies, and Empirical Performance

Physics-constrained GPR has been demonstrated in a wide range of domains:

  • Surrogate modeling for PDE-governed fields—diffusion, channel flow, or elasticity—where deep-kernel hybrid GPR provides UQ with minimal supervision (Chang et al., 2022).
  • Probabilistic state estimation in power grids using SDE-based physics-informed priors delivering nonstationary, cross-covariant joint predictions for observed and unobserved states (Ma et al., 2020).
  • Constitutive law surrogate modeling under derivative monotonicity constraints, outperforming empirical models in reliability and uncertainty estimates (Li et al., 6 Jan 2026).
  • Inverse problems such as Fredholm integral equation inversion for particle size distributions, enforcing normalization for physically plausible inference (Seyedheydari et al., 4 Jul 2025).
  • Enforcement of holonomic and non-holonomic mechanical constraints in learned rigid-body dynamics models using Gauss’ principle projections (Geist et al., 2020).
  • Model predictive control for nonlinear systems via GPs constructed with Smith-normal-form-regularized kernels encoding the local linearization of the nonlinear ODE at stable equilibria (Lepp et al., 30 Apr 2025).

A pattern emerges of substantial improvements in reliability, interpretability, and robustness, especially in extrapolation regimes or where labeled data are scarce.

7. Outlook and Extensions

Future directions include:

  • Extending frameworks to time-dependent or nonlinear PDEs.
  • Developing alternative kernel architectures (e.g., graph-based, attention-driven) or operator-based constraints tailored to more complex domains (Chang et al., 2022).
  • Generalization to multi-output and coupled systems via latent force models or co-kriging structures (Camps-Valls et al., 2020, Cross et al., 2023).
  • Rigorous theoretical analysis of generalization, error bounds, and the impact of regularization by physics-constrained evidence.
  • Scalability advances through efficient computation of spectral bases, sparse variational methods, and exploiting structure in complex operators or domains.

Physics-constrained GPR provides a principled path to combining statistical flexibility with mechanistic rigor, bridging the gap between machine learning and scientific computing for uncertainty-aware, data-efficient surrogate and inference tasks across scientific and engineering domains (Chang et al., 2022, Cross et al., 2023).
