
Generalized Lagrangian Neural Networks (2401.03728v2)

Published 8 Jan 2024 in math.DS, cs.LG, cs.NA, and math.NA

Abstract: Incorporating neural networks into the solution of Ordinary Differential Equations (ODEs) represents a pivotal research direction within computational mathematics. Within neural network architectures, integrating the intrinsic structure of ODEs offers advantages such as enhanced predictive capability and reduced data requirements. Among these structural ODE forms, the Lagrangian representation stands out due to its significant physical underpinnings. Building upon this framework, Bhattoo introduced the concept of Lagrangian Neural Networks (LNNs). In this article, we introduce Generalized Lagrangian Neural Networks (GLNNs), an extension of LNNs innovatively tailored to non-conservative systems. By leveraging the foundational role of the Lagrangian within Lagrange's equations, we formulate the model on the basis of the generalized Lagrange's equation. This modification not only enhances prediction accuracy but also guarantees a Lagrangian representation for non-conservative systems. Furthermore, we perform various experiments, encompassing 1-dimensional and 2-dimensional examples, along with an examination of the impact of network parameters, which demonstrate the superiority of Generalized Lagrangian Neural Networks (GLNNs).
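For context, the generalized Lagrange's equation the abstract refers to is typically written d/dt(∂L/∂q̇) − ∂L/∂q = Q, where Q collects the non-conservative generalized forces (e.g. friction) that a plain LNN cannot represent. The sketch below is not the paper's implementation; it is a minimal JAX illustration of this structure, using a hand-written mass-spring Lagrangian and linear damping as stand-ins for the learned network and force model:

```python
import jax
import jax.numpy as jnp

def lagrangian(q, qdot):
    # Stand-in for a learned Lagrangian L_theta(q, qdot):
    # a 1-D mass-spring system with unit mass and stiffness
    m, k = 1.0, 1.0
    return 0.5 * m * jnp.dot(qdot, qdot) - 0.5 * k * jnp.dot(q, q)

def generalized_force(q, qdot):
    # Non-conservative term Q (here: linear damping) entering the
    # right-hand side of the generalized Lagrange's equation
    c = 0.1
    return -c * qdot

def acceleration(q, qdot):
    # Solve d/dt(dL/dqdot) - dL/dq = Q for qddot:
    #   (d^2L/dqdot^2) qddot = dL/dq - (d^2L/dq dqdot) qdot + Q
    dL_dq = jax.grad(lagrangian, argnums=0)(q, qdot)
    M = jax.hessian(lagrangian, argnums=1)(q, qdot)  # generalized mass matrix
    C = jax.jacobian(jax.grad(lagrangian, argnums=1), argnums=0)(q, qdot)
    Q = generalized_force(q, qdot)
    return jnp.linalg.solve(M, dL_dq - C @ qdot + Q)

q = jnp.array([1.0])
qdot = jnp.array([0.0])
print(acceleration(q, qdot))  # spring force -k*q plus damping -c*qdot
```

Replacing `lagrangian` with a neural network (and, for GLNNs, modeling `generalized_force` as well) gives the kind of structured dynamics model the abstract describes; the accelerations can then be integrated and fit against trajectory data.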

