Estimates on the generalization error of Physics Informed Neural Networks (PINNs) for approximating a class of inverse problems for PDEs (2007.01138v3)

Published 29 Jun 2020 in math.NA, cs.LG, cs.NA, math-ph, math.AP, and math.MP

Abstract: Physics informed neural networks (PINNs) have recently been very successfully applied for efficiently approximating inverse problems for PDEs. We focus on a particular class of inverse problems, the so-called data assimilation or unique continuation problems, and prove rigorous estimates on the generalization error of PINNs approximating them. An abstract framework is presented and conditional stability estimates for the underlying inverse problem are employed to derive the estimate on the PINN generalization error, providing rigorous justification for the use of PINNs in this context. The abstract framework is illustrated with examples of four prototypical linear PDEs. Numerical experiments, validating the proposed theory, are also presented.
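The derivation outlined in the abstract can be summarized schematically. For a linear problem $\mathcal{L}u = f$ on a domain $\Omega$, with measurements $u = g$ available only on a subdomain $\omega \subset \Omega$ (and no boundary conditions), a conditional stability estimate for the inverse problem has, roughly, the following form; the norms $X$, $Y$, $Z$ and the constant $C$ depend on the particular PDE, and this rendering is a sketch of the structure rather than the paper's precise statement:

$$ \| u - v \|_{X} \;\le\; C \left( \| \mathcal{L} v - f \|_{Y(\Omega)} + \| v - g \|_{Z(\omega)} \right). $$

Applying this with $v = u_\theta$, the trained PINN, bounds the generalization error by the network's PDE residual and data mismatch, which are in turn controlled by the training loss plus a quadrature error that decays in the number of training points.

The following is a minimal, self-contained sketch of a PINN for a data assimilation problem of this type; it is not the authors' code, and the model problem (1D Poisson), network size, optimizer settings, and point counts are all illustrative choices:

```python
# Hypothetical example: recover u solving -u''(x) = f(x) on Omega = (0, 1)
# from measurements of u on the subdomain omega = (0.25, 0.75) only,
# with no boundary conditions imposed. Manufactured solution
# u(x) = sin(pi x), hence f(x) = pi^2 sin(pi x).
import math
import torch

torch.manual_seed(0)

u_exact = lambda x: torch.sin(math.pi * x)
f = lambda x: math.pi ** 2 * torch.sin(math.pi * x)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

# Collocation points for the PDE residual (all of Omega) and
# measurement points in the observation subdomain omega.
x_pde = torch.rand(256, 1, requires_grad=True)
x_obs = 0.25 + 0.5 * torch.rand(64, 1)
u_obs = u_exact(x_obs)  # noiseless synthetic data

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(5000):
    opt.zero_grad()
    u = net(x_pde)
    # First and second derivatives of the network output via autograd.
    du = torch.autograd.grad(u.sum(), x_pde, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x_pde, create_graph=True)[0]
    pde_loss = ((-d2u - f(x_pde)) ** 2).mean()      # interior PDE residual
    data_loss = ((net(x_obs) - u_obs) ** 2).mean()  # data mismatch on omega
    loss = pde_loss + data_loss                     # PINN training loss
    loss.backward()
    opt.step()
```

By the stability estimate above, driving both loss terms to zero forces the trained network toward the unknown solution on all of $\Omega$, including the region where no data were observed.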

