
Physics-Informed Gaussian Process Regression

Updated 13 January 2026
  • Physics-Informed Gaussian Process Regression is a nonparametric Bayesian framework that rigorously incorporates physical laws by embedding differential equations and boundary constraints into the GP prior.
  • It employs spectral expansions and boundary-respecting covariance kernels to construct reduced-rank, operator-consistent models for efficient uncertainty quantification.
  • The approach fuses scattered observations of states and forcing terms using GP conditioning, achieving high accuracy and global enforcement of physical constraints.

Physics-Informed Gaussian Process Regression (PI-GPR) refers to a broad class of nonparametric Bayesian regression frameworks that rigorously encode physical laws (typically differential equations and boundary or structural constraints) into the mean, covariance, and inference process of Gaussian process (GP) models. These frameworks generalize standard data-driven GP regression by ensuring that model outputs are not only statistically consistent with sparse data but also satisfy, either exactly or probabilistically, governing physical laws such as boundary value problems (BVPs), partial differential equations (PDEs), conservation relations, and operator-induced constraints.

1. Mathematical Foundation and Problem Setting

Physics-informed GPR is rooted in constructing GP priors for unknown fields $u(x)$, where $x$ is a point in a domain $\Omega \subset \mathbb{R}^d$, that not only interpolate noisy, finite data $\{(x^j, y^j)\}$ but also satisfy physical laws of the general form
$$\mathcal{L}u(x) = f(x), \quad x \in \Omega; \qquad \mathcal{B}u(x) = g(x), \quad x \in \partial\Omega,$$
with $\mathcal{L}$ a known linear (often elliptic or parabolic) differential operator and $\mathcal{B}$ a boundary operator (e.g., Dirichlet or Neumann type) (Gulian et al., 2020).

The central idea is to design the GP prior such that the sample paths of $u$ automatically (hard constraint) or softly (probabilistic/penalized constraint) satisfy the PDE and boundary conditions, with scattered or partial observations available on $u$, $f$, or other quantities linked via $\mathcal{L}$ or $\mathcal{B}$.

2. Boundary-Respecting GP Priors, Spectral Expansions, and Kernel Design

A key technical mechanism is the construction of GP covariance kernels $k(x, x')$ whose reproducing-kernel Hilbert space consists only of functions satisfying the imposed boundary conditions:
$$k(x, x') = \sum_{i=1}^M S(\sqrt{\lambda_i})\,\varphi_i(x)\,\varphi_i(x'),$$
where $\{\varphi_i, \lambda_i\}$ are the eigenfunctions and eigenvalues of the boundary-value operator $(\mathcal{L}, \mathcal{B})$, i.e., $\mathcal{L}\varphi_i = \lambda_i \varphi_i$ in $\Omega$ and $\mathcal{B}\varphi_i = 0$ on $\partial\Omega$, and $S(\cdot)$ is a user-chosen spectral density (e.g., that of a squared-exponential kernel) (Gulian et al., 2020). The expansion is truncated at $M$ modes, yielding a reduced-rank representation, and enforces boundary constraints globally, not just at finitely many "virtual" points.
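As a concrete sketch (not the reference implementation), the expansion can be instantiated for the 1D operator $\mathcal{L} = -d^2/dx^2$ on $(0,1)$ with homogeneous Dirichlet conditions, whose eigenpairs are $\lambda_i = (i\pi)^2$ and $\varphi_i(x) = \sqrt{2}\sin(i\pi x)$; the spectral weights below use the 1D squared-exponential spectral density as one possible choice of $S$, with illustrative hyperparameters:

```python
import numpy as np

def eigenpairs_1d_dirichlet(M):
    """Eigenpairs of -u'' on (0, 1) with u(0) = u(1) = 0:
    eigenvalues (i*pi)^2 and L2-normalized eigenfunctions sqrt(2)*sin(i*pi*x)."""
    idx = np.arange(1, M + 1)
    lam = (idx * np.pi) ** 2
    phi = lambda X: np.sqrt(2.0) * np.sin(np.outer(np.atleast_1d(X), idx) * np.pi)
    return lam, phi

def se_spectral_density(omega, amp=1.0, ls=0.2):
    """Spectral density of the 1D squared-exponential kernel (example choice of S)."""
    return amp**2 * np.sqrt(2 * np.pi) * ls * np.exp(-0.5 * (ls * omega) ** 2)

def bc_kernel(X, Xp, M=8, amp=1.0, ls=0.2):
    """Reduced-rank kernel k(x, x') = sum_i S(sqrt(lam_i)) phi_i(x) phi_i(x').
    Every column k(., x') vanishes at x = 0 and x = 1 by construction."""
    lam, phi = eigenpairs_1d_dirichlet(M)
    S = se_spectral_density(np.sqrt(lam), amp, ls)
    return phi(X) @ np.diag(S) @ phi(Xp).T
```

Because each $\varphi_i$ vanishes on the boundary, so does every kernel slice, and hence every GP sample path.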

This construction guarantees that $k(\cdot, x')$ always satisfies $\mathcal{B}$, so any GP sample path $u$ resides in the boundary-constrained function space. The GP prior then simultaneously controls smoothness (via $S(\cdot)$) and physical admissibility (via $\{\varphi_i\}$).

3. Operator-Consistent Joint GPs and Co-Kriging for State and Forcing

Since $\mathcal{L}$ is linear, acting on the GP prior for $u$ yields a joint GP for $(u, f)$ with block covariances:
$$\begin{pmatrix} u(x) \\ f(x) \end{pmatrix} \sim \mathcal{GP}\left(0, \begin{bmatrix} K_{uu} & K_{uf} \\ K_{fu} & K_{ff} \end{bmatrix}\right)$$
where, in spectral coordinates,

$$\begin{aligned} K_{uu}(x,x') &= \sum_{\ell=1}^M S(\sqrt{\lambda_\ell})\,\varphi_\ell(x)\varphi_\ell(x'), \\ K_{uf}(x,x') &= \sum_{\ell=1}^M S(\sqrt{\lambda_\ell})\,\lambda_\ell\,\varphi_\ell(x)\varphi_\ell(x'), \\ K_{ff}(x,x') &= \sum_{\ell=1}^M S(\sqrt{\lambda_\ell})\,\lambda_\ell^2\,\varphi_\ell(x)\varphi_\ell(x'). \end{aligned}$$

This enables fusing scattered data on $u$ and/or $f$ via standard GP conditioning, with all prior regularity and constraint properties preserved by construction (Gulian et al., 2020).
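In code, the three blocks differ only in the power of $\lambda_\ell$ multiplied into the spectral weights. A minimal sketch, taking the eigenvalues, eigenfunction map, and weights as given (e.g., from a 1D Dirichlet Laplacian; function and argument names are illustrative):

```python
import numpy as np

def joint_blocks(X, Xp, lam, phi, S):
    """Covariance blocks of the joint GP (u, f) induced by f = L u.

    lam : (M,) eigenvalues of the boundary-value operator
    phi : callable mapping points (n,) to the (n, M) eigenfunction matrix
    S   : (M,) spectral weights S(sqrt(lam))
    """
    P, Pp = phi(X), phi(Xp)
    Kuu = P @ np.diag(S) @ Pp.T            # cov(u(x), u(x'))
    Kuf = P @ np.diag(S * lam) @ Pp.T      # cov(u(x), f(x')): one factor of lambda
    Kff = P @ np.diag(S * lam**2) @ Pp.T   # cov(f(x), f(x')): two factors of lambda
    return Kuu, Kuf, Kff
```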

4. Posterior Inference with Noisy Data and Efficient Algorithms

Given noisy, finite data $y_u = u(X_u) + \epsilon_u$, $y_f = f(X_f) + \epsilon_f$, with joint observation vector $y_{\text{all}}$ and covariance

$$\widetilde{K} = K_{\text{joint}} + \sigma^2 I = \Phi\Lambda\Phi^\top + \sigma^2 I,$$

posterior means and covariances are given by:
$$\mu_{u\mid\text{data}}(x_*) = [K_{uu}(x_*, X_u),\; K_{uf}(x_*, X_f)]\,\widetilde{K}^{-1} y_{\text{all}},$$

$$\Sigma_{u\mid\text{data}}(x_*, x_*') = K_{uu}(x_*, x_*') - [K_{uu}(x_*, X_u),\; K_{uf}(x_*, X_f)]\,\widetilde{K}^{-1} \begin{bmatrix} K_{uu}(X_u, x_*') \\ K_{fu}(X_f, x_*') \end{bmatrix},$$

where all block matrices are explicitly constructed via the $\ell$-indexed formulas above (Gulian et al., 2020).
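A direct (dense) implementation of these conditioning formulas is straightforward; the sketch below takes the eigenpairs and spectral weights as inputs and treats hyperparameters as fixed (function and argument names are illustrative, not the reference code):

```python
import numpy as np

def posterior_u(x_star, Xu, yu, Xf, yf, lam, phi, S, noise=1e-4):
    """Posterior mean and covariance of u at x_star, conditioning the joint
    GP (u, f) on noisy observations yu = u(Xu) + eps and yf = f(Xf) + eps."""
    Pu, Pf, Ps = phi(Xu), phi(Xf), phi(x_star)
    D, Dl, Dll = np.diag(S), np.diag(S * lam), np.diag(S * lam**2)
    # Cross-covariance from the test points to both observation sets.
    K_star = np.hstack([Ps @ D @ Pu.T, Ps @ Dl @ Pf.T])
    # Joint observation covariance K_joint + sigma^2 I.
    K = np.block([[Pu @ D @ Pu.T,  Pu @ Dl @ Pf.T],
                  [Pf @ Dl @ Pu.T, Pf @ Dll @ Pf.T]])
    Kt = K + noise * np.eye(K.shape[0])
    y = np.concatenate([yu, yf])
    mean = K_star @ np.linalg.solve(Kt, y)
    cov = Ps @ D @ Ps.T - K_star @ np.linalg.solve(Kt, K_star.T)
    return mean, cov
```

Note that at boundary points the cross-covariance row vanishes identically, so the posterior mean is pinned to zero and the posterior variance collapses there, regardless of the data.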

The computational efficiency results from the reduced-rank form $\widetilde{K} = \Phi\Lambda\Phi^\top + \sigma^2 I$: the Woodbury identity and the matched log-determinant formula reduce all inversion and determinant operations to an $M \times M$ system, giving $O(NM^2 + M^3)$ scaling for $N$ data points and $M \ll N$ kernel modes.
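The reduced-rank algebra can be sketched as a generic Woodbury solve (not tied to the reference code): with $\Phi \in \mathbb{R}^{N \times M}$ and positive weights $w$, both the linear solve and the log-determinant only ever factor an $M \times M$ matrix:

```python
import numpy as np

def woodbury_solve_logdet(Phi, w, sigma2, y):
    """Compute alpha = (Phi diag(w) Phi^T + sigma2 I)^{-1} y and
    log det(Phi diag(w) Phi^T + sigma2 I) via an M x M inner system."""
    N, M = Phi.shape
    A = sigma2 * np.diag(1.0 / w) + Phi.T @ Phi        # M x M, symmetric positive definite
    L = np.linalg.cholesky(A)
    # Woodbury identity: K^{-1} y = (y - Phi A^{-1} Phi^T y) / sigma2.
    alpha = (y - Phi @ np.linalg.solve(A, Phi.T @ y)) / sigma2
    # Matrix determinant lemma in log form.
    logdet = ((N - M) * np.log(sigma2) + np.sum(np.log(w))
              + 2.0 * np.sum(np.log(np.diag(L))))
    return alpha, logdet
```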

5. Model Selection, Hyperparameter Estimation, and Practical Considerations

Hyperparameters (spectral weights $S(\cdot)$, observation noise $\sigma^2$, and possibly kernel lengthscales and amplitudes) are estimated by maximizing the marginal likelihood:
$$\ell(\theta) = -\frac{1}{2} y^\top \widetilde{K}^{-1} y - \frac{1}{2} \log|\widetilde{K}| - \frac{N}{2} \log 2\pi.$$
By differentiating the reduced-rank determinants and inverses with respect to $\theta$, gradients for quasi-Newton optimization are available analytically (Gulian et al., 2020).
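Combining the reduced-rank pieces gives an $O(NM^2)$ negative log marginal likelihood that can be handed to an optimizer; the parameterization below (log-amplitude, log-lengthscale, log-noise with a squared-exponential spectral density, observations of $u$ only) is one illustrative choice, not the paper's exact setup:

```python
import numpy as np

def neg_log_marginal_likelihood(theta, Phi, lam, y):
    """-l(theta) for the reduced-rank model with observations of u only.
    theta = (log amplitude, log lengthscale, log noise variance)."""
    amp, ls, s2 = np.exp(theta)
    # Example spectral weights: 1D squared-exponential density at omega = sqrt(lam).
    S = amp**2 * np.sqrt(2 * np.pi) * ls * np.exp(-0.5 * ls**2 * lam)
    N, M = Phi.shape
    A = s2 * np.diag(1.0 / S) + Phi.T @ Phi
    L = np.linalg.cholesky(A)
    # Woodbury quadratic form and matrix-determinant-lemma log-determinant.
    quad = y @ (y - Phi @ np.linalg.solve(A, Phi.T @ y)) / s2
    logdet = (N - M) * np.log(s2) + np.sum(np.log(S)) + 2 * np.sum(np.log(np.diag(L)))
    return 0.5 * (quad + logdet + N * np.log(2 * np.pi))
```

This can be minimized with, e.g., `scipy.optimize.minimize(..., method="L-BFGS-B")`, supplying gradients analytically or via finite differences.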

Homogeneous (zero) boundary data are enforced globally; inhomogeneous boundary conditions $\mathcal{B}u = g$ can be handled by reduction: write $u = u_1 + u_2$ with $\mathcal{L}u_2 = 0$, $\mathcal{B}u_2 = g$, and solve for $u_1$ by the method above with homogeneous boundary conditions. This route is not detailed in full in the primary reference but provides the natural generalization.
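Explicitly, if $u_2$ solves the lifting problem, the remainder $u_1 = u - u_2$ satisfies a problem of exactly the homogeneous form treated above:

$$\mathcal{L}u_1 = \mathcal{L}u - \mathcal{L}u_2 = f - 0 = f \ \text{in } \Omega, \qquad \mathcal{B}u_1 = \mathcal{B}u - \mathcal{B}u_2 = g - g = 0 \ \text{on } \partial\Omega,$$

so the boundary-respecting GP prior can be placed on $u_1$, with the deterministic lift $u_2$ computed separately.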

6. Numerical Results and Comparative Advantages

Illustrative results on the 1D Poisson problem ($-u'' = f$, $u(0) = u(1) = 0$) using only five scattered $f$-observations and eight kernel modes demonstrate global $\ell^2$-error below $1\%$, with error saturating at low noise. By contrast, a naive "physics-informed" GPR that enforces only $f = \mathcal{L}u$ via operator co-kriging and a generic kernel drifts by many tens of percent unless additional explicit boundary data are supplied (Gulian et al., 2020). In 2D, for mixed-BC Helmholtz problems (Dirichlet + Neumann boundaries), constructing $\{\varphi_{m,n}\}$ as a BC-satisfying basis yields BVP-GPR with $2.9\%$ error, outperforming PDE-GPR without built-in BCs ($5.3\%$ error). Enforcing the boundary constraints in the prior also collapses the posterior variance to zero on $\partial\Omega$ (Gulian et al., 2020).
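A version of the 1D experiment can be sketched end to end in a few lines; the hyperparameters here (lengthscale $0.2$, jitter $10^{-6}$) are illustrative guesses rather than the paper's tuned values, so the exact error figure will differ:

```python
import numpy as np

# Recover u from five observations of f = -u'' alone, with eight Dirichlet modes.
M = 8
idx = np.arange(1, M + 1)
lam = (idx * np.pi) ** 2                          # eigenvalues of -d^2/dx^2, u(0)=u(1)=0
phi = lambda X: np.sqrt(2) * np.sin(np.outer(X, idx) * np.pi)
S = np.sqrt(2 * np.pi) * 0.2 * np.exp(-0.5 * (0.2 * np.sqrt(lam)) ** 2)  # SE density, ls = 0.2

Xf = np.linspace(0.1, 0.9, 5)                     # five scattered f-observations
yf = np.pi**2 * np.sin(np.pi * Xf)                # f for the truth u(x) = sin(pi x)

Pf = phi(Xf)
Kff = Pf @ np.diag(S * lam**2) @ Pf.T             # forcing-block covariance
alpha = np.linalg.solve(Kff + 1e-6 * np.eye(5), yf)

xs = np.linspace(0, 1, 101)
mu = phi(xs) @ np.diag(S * lam) @ Pf.T @ alpha    # posterior mean of u via K_uf
err = np.linalg.norm(mu - np.sin(np.pi * xs)) / np.linalg.norm(np.sin(np.pi * xs))
```

Even without any $u$-observations, the posterior mean satisfies $u(0) = u(1) = 0$ exactly, because every kernel mode does.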

This demonstrates that BC-satisfying spectral kernels provide superior accuracy and stability (especially when solving for $u$ from $f$ alone) and enable correct uncertainty quantification globally, including at the domain boundary, without the need for hand-placed virtual points.

7. Connections, Extensions, and General Framework

Physics-informed GP regression with spectral, boundary-enforcing kernels is one thread in a larger landscape that includes:

  • White-box, operator-consistent kernels for exact solution spaces of PDEs (commutative algebraic constructions (Li et al., 6 Feb 2025)), Green's function (Brownian bridge) GP priors (Alberts et al., 28 Feb 2025), and methods using bounded-linear-operator observations for general PDE/inverse problems (Pförtner et al., 2022).
  • Hybrid models combining physics-constrained priors with learned kernel structure (e.g., deep kernel learning with physics-based Boltzmann-Gibbs priors or energy functionals) (Chang et al., 2022).
  • Approaches that encode constraints via derivative coupling, operator differentiation in the GP prior, or as soft-constraint terms in the likelihood (AutoIP (Long et al., 2022), PhIK (Yang et al., 2018)).
  • Multifidelity schemes (CoPhIK) that couple physics-based low-fidelity GP priors with a data-driven discrepancy GP for high-fidelity prediction (Yang et al., 2018).
  • Extensions to nonlinear equations, inverse problems, and uncertainty quantification in complex physical systems (e.g., power grids (Ma et al., 2020), variational and Bayesian interpretations (Alberts et al., 28 Feb 2025), structural health monitoring (Tondo et al., 2023)), and systematic kernel construction for constrained, multi-output vector fields (e.g., divergence-free, boundary-conforming flow fields in fluids (Padilla-Segarra et al., 23 Jul 2025)).

This ecosystem provides a rigorous probabilistic framework to unify model-based and data-driven reasoning, generalizing classical numerical PDE solvers, inverse problem methodology, and model calibration, while adding scalable, certified uncertainty quantification.


References (arXiv IDs):

  • (Gulian et al., 2020): "Gaussian Process Regression constrained by Boundary Value Problems"
  • (Yang et al., 2018): "Physics-Information-Aided Kriging: Constructing Covariance Functions using Stochastic Simulation Models"
  • (Li et al., 6 Feb 2025): "Gaussian Process Regression for Inverse Problems in Linear PDEs"
  • (Chang et al., 2022): "A hybrid data driven-physics constrained Gaussian process regression framework with deep kernel for uncertainty quantification"
  • (Pförtner et al., 2022): "Physics-Informed Gaussian Process Regression Generalizes Linear PDE Solvers"
  • (Cross et al., 2023): "A spectrum of physics-informed Gaussian processes for regression in engineering"
  • (Alberts et al., 28 Feb 2025): "An interpretation of the Brownian bridge as a physics-informed prior for the Poisson equation"
  • (Long et al., 2022): "AutoIP: A United Framework to Integrate Physics into Gaussian Processes"
  • (Yang et al., 2018): "Physics-Informed CoKriging: A Gaussian-Process-Regression-Based Multifidelity Method for Data-Model Convergence"
  • (Tondo et al., 2023): "Physics-informed Gaussian process model for Euler-Bernoulli beam elements"
  • (Padilla-Segarra et al., 23 Jul 2025): "Physics-informed, boundary-constrained Gaussian process regression for the reconstruction of fluid flow fields"
