An analysis of Universal Differential Equations for data-driven discovery of Ordinary Differential Equations (2306.10335v1)
Abstract: In the last decade, the scientific community has devoted increasing attention to data-driven approaches in scientific research, aiming to provide accurate and reliable analyses of a wide range of phenomena. Most notably, Physics-informed Neural Networks and, more recently, Universal Differential Equations (UDEs) have proved effective in both system integration and identification. However, an in-depth analysis of these techniques is still lacking. In this work, we contribute by testing the UDE framework in the context of Ordinary Differential Equation (ODE) discovery. In our analysis, performed on two case studies, we highlight some of the issues that arise when combining data-driven approaches with numerical solvers, and we investigate the importance of the data collection process. We believe that our analysis represents a significant contribution to the investigation of the capabilities and limitations of Physics-informed Machine Learning frameworks.
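To make the UDE idea concrete, below is a minimal, self-contained sketch of the kind of setup the abstract refers to: the known part of an ODE's right-hand side is kept, an unknown term is replaced by a small neural network, and the network weights are fitted so that simulated trajectories match observed data. The specific system (a Lotka-Volterra-style model), network size, and the SciPy-based derivative-free optimizer are illustrative assumptions, not the paper's actual experimental setup.

```python
# Minimal UDE sketch (illustrative, not the paper's code): known linear terms
# are retained, unknown interaction terms are replaced by a tiny neural
# network trained against trajectory data produced by the true system.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def true_rhs(t, u, alpha=1.3, beta=0.9, gamma=0.8, delta=1.8):
    # Ground-truth Lotka-Volterra dynamics (stands in for measured data).
    x, y = u
    return [alpha * x - beta * x * y, -delta * y + gamma * x * y]

t_eval = np.linspace(0.0, 3.0, 31)
u0 = [0.5, 4.5]
data = solve_ivp(true_rhs, (0.0, 3.0), u0, t_eval=t_eval).y

# Tiny fully connected network 2 -> 5 -> 2 with tanh activation.
shapes = [(5, 2), (5,), (2, 5), (2,)]
n_params = sum(int(np.prod(s)) for s in shapes)

def unpack(theta):
    parts, i = [], 0
    for s in shapes:
        n = int(np.prod(s))
        parts.append(theta[i:i + n].reshape(s))
        i += n
    return parts

def nn(theta, u):
    W1, b1, W2, b2 = unpack(theta)
    return W2 @ np.tanh(W1 @ u + b1) + b2

def ude_rhs(t, u, theta, alpha=1.3, delta=1.8):
    # Known linear physics plus a learned correction for the unknown terms.
    x, y = u
    correction = nn(theta, np.asarray(u))
    return [alpha * x + correction[0], -delta * y + correction[1]]

def loss(theta):
    sol = solve_ivp(ude_rhs, (0.0, 3.0), u0, t_eval=t_eval, args=(theta,))
    if sol.y.shape != data.shape:  # solver failed or trajectory diverged
        return 1e6
    return float(np.mean((sol.y - data) ** 2))

theta0 = 0.1 * rng.standard_normal(n_params)
res = minimize(loss, theta0, method="Nelder-Mead",
               options={"maxiter": 2000, "xatol": 1e-6, "fatol": 1e-8})
print("final trajectory MSE:", res.fun)
```

In practice, frameworks built for UDEs differentiate through the ODE solver (adjoint or automatic differentiation) and use gradient-based optimizers such as Adam; the derivative-free optimizer here is only a simple stand-in to keep the sketch short and dependency-free.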