Attractor reconstruction with reservoir computers: The effect of the reservoir's conditional Lyapunov exponents on faithful attractor reconstruction (2401.00885v2)
Abstract: Reservoir computing is a machine learning framework that has been shown to replicate the chaotic attractor, including the fractal dimension and the entire Lyapunov spectrum, of the dynamical system on which it is trained. We quantitatively relate the generalized synchronization dynamics of a driven reservoir during the training stage to the performance of the trained reservoir computer at the attractor reconstruction task. We show that, in order to obtain successful attractor reconstruction and Lyapunov spectrum estimation, the largest conditional Lyapunov exponent of the driven reservoir must be significantly more negative than the most negative Lyapunov exponent of the target system. We also find that the largest conditional Lyapunov exponent of the reservoir depends strongly on the spectral radius of the reservoir adjacency matrix; therefore, for attractor reconstruction and Lyapunov spectrum estimation, reservoir computers with a small spectral radius generally perform better. Our arguments are supported by numerical examples on well-known chaotic systems.
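To make the central quantity concrete, below is a minimal sketch (not the authors' implementation) of estimating the largest conditional Lyapunov exponent of a reservoir driven by the Lorenz-63 system. The leaky-tanh echo state network form, the Benettin-style two-trajectory renormalization, and all parameter values (reservoir size, spectral radius, leak rate, input scaling) are illustrative assumptions rather than the paper's settings.

```python
# Minimal sketch: largest conditional Lyapunov exponent of a driven reservoir.
# Assumptions: leaky-tanh echo state network, Lorenz-63 drive, illustrative parameters.
import numpy as np

rng = np.random.default_rng(0)

# --- Lorenz-63 drive signal (RK4 integration) ---
def lorenz_rhs(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

def rk4_step(x, dt):
    k1 = lorenz_rhs(x)
    k2 = lorenz_rhs(x + 0.5 * dt * k1)
    k3 = lorenz_rhs(x + 0.5 * dt * k2)
    k4 = lorenz_rhs(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, n_steps = 0.01, 20000
u = np.empty((n_steps, 3))
u[0] = [1.0, 1.0, 1.0]
for t in range(1, n_steps):
    u[t] = rk4_step(u[t - 1], dt)

# --- Driven (open-loop) reservoir: r_{t+1} = (1-a) r_t + a tanh(A r_t + W_in u_t) ---
N, rho_A, leak = 500, 0.4, 1.0                                # small spectral radius
A = rng.normal(size=(N, N)) * (rng.random((N, N)) < 0.02)     # sparse random adjacency
A *= rho_A / np.max(np.abs(np.linalg.eigvals(A)))             # rescale to spectral radius rho_A
W_in = rng.uniform(-0.1, 0.1, size=(N, 3))                    # input coupling

def step(r, u_t):
    return (1 - leak) * r + leak * np.tanh(A @ r + W_in @ u_t)

# --- Largest conditional Lyapunov exponent of the driven reservoir ---
# Two reservoir states receive the *same* drive; their separation contracts (or grows)
# at a rate given by the largest CLE, which is negative when the reservoir synchronizes
# with the drive in the generalized sense.
r1 = np.zeros(N)
r2 = r1 + 1e-8 * rng.normal(size=N)
eps0 = np.linalg.norm(r2 - r1)
log_growth, washout = 0.0, 1000
for t in range(n_steps):
    r1, r2 = step(r1, u[t]), step(r2, u[t])
    if t >= washout:
        d = np.linalg.norm(r2 - r1)
        log_growth += np.log(d / eps0)
        r2 = r1 + (r2 - r1) * (eps0 / d)      # renormalize the perturbation
lambda_cond = log_growth / ((n_steps - washout) * dt)
print(f"largest conditional Lyapunov exponent ~ {lambda_cond:.2f} per unit time")
```

Under the paper's criterion, this value should come out significantly more negative than the most negative Lyapunov exponent of the target system (roughly -14.6 for Lorenz-63 at the standard parameters) for faithful attractor reconstruction; repeating the estimate for larger spectral radii rho_A would illustrate the reported dependence of the conditional Lyapunov exponent on the spectral radius.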