
Spectral-Bias and Kernel-Task Alignment in Physically Informed Neural Networks (2307.06362v2)

Published 12 Jul 2023 in stat.ML, cond-mat.dis-nn, and cs.LG

Abstract: Physically informed neural networks (PINNs) are a promising emerging method for solving differential equations. As in many other deep learning approaches, the choice of PINN design and training protocol requires careful craftsmanship. Here, we suggest a comprehensive theoretical framework that sheds light on this important problem. Leveraging an equivalence between infinitely over-parameterized neural networks and Gaussian process regression (GPR), we derive an integro-differential equation that governs PINN prediction in the large data-set limit -- the neurally-informed equation. This equation augments the original one by a kernel term reflecting architecture choices and allows quantifying implicit bias induced by the network via a spectral decomposition of the source term in the original differential equation.
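The spectral decomposition the abstract refers to can be illustrated with a minimal numerical sketch (not the authors' code; the RBF kernel, length scale, and toy source term below are arbitrary illustrative choices): project a source term onto the eigenbasis of a kernel Gram matrix and observe that its low-frequency content aligns with the large-eigenvalue modes, which is the kernel-task alignment picture the paper quantifies.

```python
import numpy as np

# Illustrative sketch only: decompose a toy source term f(x) in the
# eigenbasis of an RBF kernel's Gram matrix, mimicking the kind of
# spectral analysis the paper applies to PINN predictions.
n = 200
x = np.linspace(0.0, 1.0, n)

# RBF kernel Gram matrix; the length scale ell is an arbitrary choice here.
ell = 0.1
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * ell**2))

# Eigendecomposition: eigenvectors approximate the kernel's eigenfunctions.
eigvals, eigvecs = np.linalg.eigh(K)
order = np.argsort(eigvals)[::-1]            # sort by decreasing eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Toy source term mixing a low and a high spatial frequency.
f = np.sin(2 * np.pi * x) + 0.3 * np.sin(20 * np.pi * x)

# Project f onto the kernel eigenmodes. Spectral bias predicts that the
# low-frequency component, aligned with large-eigenvalue modes, is the
# part the (infinitely wide) network learns most readily.
coeffs = eigvecs.T @ f
power = coeffs**2 / np.sum(coeffs**2)
print("fraction of f's power in the top 10 kernel modes:",
      float(np.sum(power[:10])))
```

With these choices, most of the source term's power lands in the leading eigenmodes, since the dominant `sin(2πx)` component is well aligned with the kernel's top eigenfunctions, while the small high-frequency component sits in rapidly decaying modes.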
