Catch-22s of reservoir computing (2210.10211v3)
Abstract: Reservoir Computing (RC) is a simple and efficient model-free framework for forecasting the behavior of nonlinear dynamical systems from data. Here, we show that there exist commonly-studied systems for which leading RC frameworks struggle to learn the dynamics unless key information about the underlying system is already known. We focus on the important problem of basin prediction -- determining which attractor a system will converge to from its initial conditions. First, we show that the predictions of standard RC models (echo state networks) depend critically on warm-up time, requiring a warm-up trajectory containing almost the entire transient in order to identify the correct attractor. Accordingly, we turn to Next-Generation Reservoir Computing (NGRC), an attractive variant of RC that requires negligible warm-up time. By incorporating the exact nonlinearities in the original equations, we show that NGRC can accurately reconstruct intricate and high-dimensional basins of attraction, even with sparse training data (e.g., a single transient trajectory). Yet, a tiny uncertainty in the exact nonlinearity can render prediction accuracy no better than chance. Our results highlight the challenges faced by data-driven methods in learning the dynamics of multistable systems and suggest potential avenues to make these approaches more robust.
- W. Maass, T. Natschläger, and H. Markram, Real-time computing without stable states: A new framework for neural computation based on perturbations, Neural Comput. 14, 2531 (2002).
- H. Jaeger and H. Haas, Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication, Science 304, 78 (2004).
- M. Lukoševičius and H. Jaeger, Reservoir computing approaches to recurrent neural network training, Comput. Sci. Rev. 3, 127 (2009).
- D. Canaday, A. Griffith, and D. J. Gauthier, Rapid time series prediction with a hardware-based reservoir computer, Chaos 28, 123119 (2018).
- T. L. Carroll, Using reservoir computers to distinguish chaotic signals, Phys. Rev. E 98, 052209 (2018).
- G. A. Gottwald and S. Reich, Combining machine learning and data assimilation to forecast dynamical systems from noisy partial observations, Chaos 31, 101103 (2021).
- K. Nakajima and I. Fischer, Reservoir Computing (Springer, 2021).
- Z. Lu, B. R. Hunt, and E. Ott, Attractor reconstruction by machine learning, Chaos 28, 061104 (2018).
- L. Grigoryeva, A. Hart, and J.-P. Ortega, Learning strange attractors with reservoir systems, Nonlinearity 36, 4674 (2023).
- A. Röhm, D. J. Gauthier, and I. Fischer, Model-free inference of unseen attractors: Reconstructing phase space features from a single noisy trajectory using reservoir computing, Chaos 31, 103127 (2021).
- D. Patel and E. Ott, Using machine learning to anticipate tipping points and extrapolate to post-tipping dynamics of non-stationary dynamical systems, Chaos 33 (2023).
- T. L. Carroll and L. M. Pecora, Network structure effects in reservoir computers, Chaos 29, 083130 (2019).
- J. Jiang and Y.-C. Lai, Model-free prediction of spatiotemporal dynamical systems with recurrent neural networks: Role of network spectral radius, Phys. Rev. Res. 1, 033056 (2019).
- L. Gonon and J.-P. Ortega, Reservoir computing universality with stochastic inputs, IEEE Trans. Neural Netw. Learn. Syst. 31, 100 (2019).
- A. Griffith, A. Pomerance, and D. J. Gauthier, Forecasting chaotic systems with very low connectivity reservoir computers, Chaos 29, 123108 (2019).
- T. L. Carroll, Do reservoir computers work best at the edge of chaos?, Chaos 30, 121109 (2020).
- A. G. Hart, J. L. Hook, and J. H. Dawes, Echo state networks trained by Tikhonov least squares are $L^2(\mu)$ approximators of ergodic dynamical systems, Physica D 421, 132882 (2021).
- A. Flynn, V. A. Tsachouridis, and A. Amann, Multifunctionality in a reservoir computer, Chaos 31, 013125 (2021).
- T. L. Carroll, Optimizing memory in reservoir computers, Chaos 32, 023123 (2022).
- E. Bollt, On explaining the surprising success of reservoir computing forecaster of chaos? The universal machine learning dynamical system with contrast to VAR and DMD, Chaos 31, 013108 (2021).
- D. J. Gauthier, I. Fischer, and A. Röhm, Learning unseen coexisting attractors, Chaos 32 (2022).
- J. J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proc. Natl. Acad. Sci. U.S.A. 79, 2554 (1982).
- A. E. Teschendorff and A. P. Feinberg, Statistical mechanics meets single-cell biology, Nat. Rev. Genet. 22, 459 (2021).
- Note that the warm-up time series is different from the training data and is only used after training has been completed.
- M. Lukoševičius, A practical guide to applying echo state networks, in Neural Networks: Tricks of the Trade, 2nd ed. (Springer, 2012), p. 659.
- L. Jaurigue and K. Lüdge, Connecting reservoir computing with statistical forecasting and deep neural networks, Nat. Commun. 13, 227 (2022).
- S. L. Brunton, J. L. Proctor, and J. N. Kutz, Discovering governing equations from data by sparse identification of nonlinear dynamical systems, Proc. Natl. Acad. Sci. U.S.A. 113, 3932 (2016).
- A. Rahimi and B. Recht, Random features for large-scale kernel machines, NeurIPS 20 (2007).
- S. Shahi, F. H. Fenton, and E. M. Cherry, Prediction of chaotic time series using recurrent neural networks and reservoir computing techniques: A comparative study, Mach. Learn. Appl. 8, 100300 (2022).
- J. C. Butcher, Numerical methods for ordinary differential equations (John Wiley & Sons, 2016).
- We did not observe any attractors other than the three ground-truth fixed points and infinity for any of the NGRC models considered. The absence of more complicated attractors (compared to RC) is likely due to the simpler architecture of NGRC and the dissipativity of the real dynamics (which NGRC models can learn directly via the linear features).
- D. A. Wiley, S. H. Strogatz, and M. Girvan, The size of the sync basin, Chaos 16, 015103 (2006).
- R. Delabays, M. Tyloo, and P. Jacquod, The size of the sync basin revisited, Chaos 27, 103109 (2017).
- Y. Zhang and S. H. Strogatz, Basins with tentacles, Phys. Rev. Lett. 127, 194101 (2021).
- W. E, A proposal on machine learning via dynamical systems, Commun. Math. Stat. 1, 1 (2017).
- W. Gilpin, Deep reconstruction of strange attractors from time series, NeurIPS 33, 204 (2020).
- N. H. Nelsen and A. M. Stuart, The random feature model for input-output maps between Banach spaces, SIAM J. Sci. Comput. 43, A3212 (2021).
- M. Belkin, Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation, Acta Numerica 30, 203 (2021).
- M. Levine and A. Stuart, A framework for machine learning of model error in dynamical systems, Commun. Am. Math. Soc. 2, 283 (2022).
- Our source code can be found at https://github.com/spcornelius/RCBasins.