
AI-Lorenz: A physics-data-driven framework for black-box and gray-box identification of chaotic systems with symbolic regression (2312.14237v1)

Published 21 Dec 2023 in physics.comp-ph, cs.LG, nlin.CD, and physics.data-an

Abstract: Discovering mathematical models that characterize the observed behavior of dynamical systems remains a major challenge, especially for systems in a chaotic regime. The challenge is even greater when the physics underlying such systems is not yet understood, and scientific inquiry must solely rely on empirical data. Driven by the need to fill this gap, we develop a framework that learns mathematical expressions modeling complex dynamical behaviors by identifying differential equations from noisy and sparse observable data. We train a small neural network to learn the dynamics of a system, its rate of change in time, and missing model terms, which are used as input for a symbolic regression algorithm to autonomously distill the explicit mathematical terms. This, in turn, enables us to predict the future evolution of the dynamical behavior. The performance of this framework is validated by recovering the right-hand sides and unknown terms of certain complex, chaotic systems such as the well-known Lorenz system, a six-dimensional hyperchaotic system, and the non-autonomous Sprott chaotic system, and comparing them with their known analytical expressions.
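The pipeline the abstract describes (estimate the time derivatives of observed states, then distill explicit right-hand-side terms from them) can be illustrated with a minimal sketch on the Lorenz system. This is not the paper's method: it uses exact derivatives in place of the neural-network surrogate and sequentially thresholded sparse polynomial regression (a SINDy-style stand-in) in place of the symbolic regression step; all function and variable names are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lorenz system with the classical chaotic parameters.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def lorenz(t, s):
    x, y, z = s
    return [SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z]

# 1) Generate trajectory data (stand-in for observed states).
sol = solve_ivp(lorenz, (0, 20), [1.0, 1.0, 1.0],
                t_eval=np.linspace(0, 20, 4000), rtol=1e-9, atol=1e-9)
X = sol.y.T                                # states, shape (N, 3)
# In the paper a small neural network estimates dX from noisy, sparse data;
# here we use the exact right-hand side for simplicity.
dX = np.array([lorenz(0, s) for s in X])

# 2) Candidate term library: polynomials up to degree 2 in (x, y, z).
def library(X):
    x, y, z = X[:, 0], X[:, 1], X[:, 2]
    terms = [np.ones_like(x), x, y, z, x*x, x*y, x*z, y*y, y*z, z*z]
    names = ["1", "x", "y", "z", "x^2", "xy", "xz", "y^2", "yz", "z^2"]
    return np.column_stack(terms), names

Theta, names = library(X)

# 3) Sequentially thresholded least squares: fit, zero small coefficients, refit.
def stlsq(Theta, dX, threshold=0.1, iters=10):
    W = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(W) < threshold
        W[small] = 0.0
        for k in range(dX.shape[1]):
            big = ~small[:, k]
            if big.any():
                W[big, k] = np.linalg.lstsq(Theta[:, big], dX[:, k], rcond=None)[0]
    return W

W = stlsq(Theta, dX)
for k, lhs in enumerate(["dx/dt", "dy/dt", "dz/dt"]):
    rhs = " + ".join(f"{w:.3f}*{n}" for n, w in zip(names, W[:, k]) if w != 0.0)
    print(f"{lhs} = {rhs}")
```

With clean derivatives the recovered coefficients match the known Lorenz equations (e.g. dx/dt = -10x + 10y); a genuine symbolic regression engine such as PySR, as used in the paper, can additionally discover non-polynomial terms that a fixed library would miss.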

