Physics-informed machine learning as a kernel method (2402.07514v2)

Published 12 Feb 2024 in cs.AI, math.ST, and stat.TH

Abstract: Physics-informed machine learning combines the expressiveness of data-based approaches with the interpretability of physical models. In this context, we consider a general regression problem where the empirical risk is regularized by a partial differential equation that quantifies the physical inconsistency. We prove that for linear differential priors, the problem can be formulated as a kernel regression task. Taking advantage of kernel theory, we derive convergence rates for the minimizer of the regularized risk and show that it converges at least at the Sobolev minimax rate. However, faster rates can be achieved, depending on the physical error. This principle is illustrated with a one-dimensional example, supporting the claim that regularizing the empirical risk with physical information can be beneficial to the statistical performance of estimators.
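
To make the setup concrete, a schematic form of the PDE-regularized risk described in the abstract is sketched below. The notation is an assumption (the abstract does not fix it): $\mathscr{D}$ stands for the linear differential operator encoding the physical prior, $\Omega$ for the spatial domain, and $\lambda_n > 0$ for the weight on the physical penalty.

$$
\hat{f}_n \in \operatorname*{arg\,min}_{f}\;\; \frac{1}{n}\sum_{i=1}^{n}\bigl(Y_i - f(X_i)\bigr)^2 \;+\; \lambda_n \,\bigl\|\mathscr{D} f\bigr\|_{L^2(\Omega)}^{2}.
$$

When $\mathscr{D}$ is linear, the abstract states that this penalized problem can be recast as a kernel regression task, so the minimizer can be studied with standard kernel (RKHS) tools; this is the route by which the Sobolev minimax rate, and the faster rates depending on the physical error, are obtained.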
