Koopman operator learning using invertible neural networks (2306.17396v2)
Abstract: In Koopman operator theory, a finite-dimensional nonlinear system is transformed into an infinite-dimensional but linear system using a set of observable functions. However, manually selecting observable functions that span an invariant subspace of the Koopman operator based on prior knowledge is inefficient and challenging, particularly when little or no information is available about the underlying system. Furthermore, current methodologies tend to disregard the invertibility of the observable functions, which leads to inaccurate results. To address these challenges, we propose FlowDMD, a Flow-based Dynamic Mode Decomposition that utilizes the Coupling Flow Invertible Neural Network (CF-INN) framework. FlowDMD leverages the intrinsically invertible structure of the CF-INN to learn invariant subspaces of the Koopman operator and to accurately reconstruct the state variables. Numerical experiments demonstrate the superior performance of our algorithm compared to state-of-the-art methodologies.
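To make the coupling-flow idea concrete, the sketch below shows a minimal PyTorch affine coupling layer, the RealNVP-style building block of a CF-INN. The class name, hidden width, and splitting scheme are illustrative assumptions, not the paper's implementation; the point is that each layer, and hence the stacked map from states to observables, is analytically invertible, so states can be reconstructed exactly from the learned observables.

```python
# Minimal sketch of an affine coupling layer (RealNVP-style), the building
# block of a coupling-flow invertible network (CF-INN). Names and sizes are
# illustrative assumptions, not the authors' implementation.
import torch
import torch.nn as nn

class AffineCouplingLayer(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2  # split the state vector into two halves
        # Scale and shift networks act only on the first half, so the
        # transform of the second half can be inverted in closed form.
        self.scale = nn.Sequential(nn.Linear(self.d, hidden), nn.Tanh(),
                                   nn.Linear(hidden, dim - self.d))
        self.shift = nn.Sequential(nn.Linear(self.d, hidden), nn.Tanh(),
                                   nn.Linear(hidden, dim - self.d))

    def forward(self, x):
        # y1 = x1,  y2 = x2 * exp(s(x1)) + t(x1)
        x1, x2 = x[..., :self.d], x[..., self.d:]
        y2 = x2 * torch.exp(self.scale(x1)) + self.shift(x1)
        return torch.cat([x1, y2], dim=-1)

    def inverse(self, y):
        # Exact inverse: x2 = (y2 - t(y1)) * exp(-s(y1)),  x1 = y1
        y1, y2 = y[..., :self.d], y[..., self.d:]
        x2 = (y2 - self.shift(y1)) * torch.exp(-self.scale(y1))
        return torch.cat([y1, x2], dim=-1)
```

Stacking several such layers (with permutations of the coordinates between them) yields an invertible map g from states to observables; a linear (Koopman-style) evolution can then be fit in observable space and states recovered through g^{-1} without a separate decoder.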