TENG: Time-Evolving Natural Gradient for Solving PDEs With Deep Neural Nets Toward Machine Precision (2404.10771v2)

Published 16 Apr 2024 in cs.LG, physics.comp-ph, cs.NA, and math.NA

Abstract: Partial differential equations (PDEs) are instrumental for modeling dynamical systems in science and engineering. The advent of neural networks has initiated a significant shift in tackling these complexities, though challenges in accuracy persist, especially for initial value problems. In this paper, we introduce the $\textit{Time-Evolving Natural Gradient (TENG)}$, generalizing time-dependent variational principles and optimization-based time integration, and leveraging natural gradient optimization to obtain high accuracy in neural-network-based PDE solutions. Our comprehensive development includes algorithms like TENG-Euler and its high-order variants, such as TENG-Heun, tailored for enhanced precision and efficiency. TENG's effectiveness is further validated through its performance, surpassing current leading methods and achieving $\textit{machine precision}$ in step-by-step optimizations across a spectrum of PDEs, including the heat equation, Allen-Cahn equation, and Burgers' equation.
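The TENG-Euler idea sketched in the abstract — advance the solution one explicit time step, then re-fit the network to the advanced state using a natural-gradient (Gauss-Newton) update in parameter space — can be illustrated with a heavily simplified toy. The sketch below assumes a linear sine-feature "network", the 1-D heat equation with Dirichlet boundaries, and finite-difference spatial derivatives; all names and choices here are illustrative assumptions, not the paper's implementation.

```python
# Illustrative TENG-Euler-style step (a sketch, not the paper's algorithm):
# 1) advance u_t = u_xx by one explicit Euler step on collocation points,
# 2) re-fit the model to the advanced target with a natural-gradient /
#    Gauss-Newton update: delta = (J^T J + lam*I)^{-1} J^T r.
import numpy as np

x = np.linspace(0.0, np.pi, 64)  # collocation grid on [0, pi]

def model(theta, x):
    # Tiny sine-feature model u(x) = sum_k theta_k sin((k+1) x).
    # It is linear in theta, so the Jacobian du/dtheta is exact and
    # the example stays self-contained.
    K = len(theta)
    feats = np.stack([np.sin((k + 1) * x) for k in range(K)], axis=1)
    return feats @ theta, feats  # model outputs and Jacobian

def heat_rhs(u, x):
    # Second derivative via central differences; u = 0 at the boundaries.
    h = x[1] - x[0]
    uxx = np.zeros_like(u)
    uxx[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2
    return uxx

theta = np.array([1.0, 0.0, 0.0])  # initial condition u(x, 0) = sin(x)
dt, lam = 1e-3, 1e-10              # step size and damping (assumed values)

for _ in range(100):               # integrate to t = 0.1
    u, J = model(theta, x)
    target = u + dt * heat_rhs(u, x)   # explicit Euler target state
    r = target - u                     # residual to fit
    # Natural-gradient / Gauss-Newton step: pull the L2 function-space
    # metric back to parameter space via the Jacobian J.
    delta = np.linalg.solve(J.T @ J + lam * np.eye(len(theta)), J.T @ r)
    theta = theta + delta

# The analytic solution for u0 = sin(x) is exp(-t) sin(x).
u_final, _ = model(theta, x)
err = np.max(np.abs(u_final - np.exp(-0.1) * np.sin(x)))
print(f"max error vs analytic solution: {err:.2e}")
```

For this linear toy model the Gauss-Newton step solves the per-step fitting problem exactly, which mirrors why natural-gradient steps can drive the per-step optimization error toward machine precision; the paper's TENG-Heun variant additionally replaces the explicit Euler target with a second-order Heun target.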
