Variability of echo state network prediction horizon for partially observed dynamical systems (2306.10797v3)

Published 19 Jun 2023 in eess.SY, cs.LG, cs.SY, and math.DS

Abstract: The study of dynamical systems using partial state observation is an important problem due to its applicability to many real-world systems. We address the problem by studying an echo state network (ESN) framework with partial state input and partial or full state output. Applications to the Lorenz system and Chua's oscillator (both numerically simulated and experimental systems) demonstrate the effectiveness of our method. We show that the ESN, as an autonomous dynamical system, is capable of making short-term predictions up to a few Lyapunov times. However, the prediction horizon has high variability depending on the initial condition, an aspect that we explore in detail using the distribution of the prediction horizon. Further, comparing the long-term dynamics of the ESN predictions with the numerically simulated or experimental dynamics using a variety of statistical metrics, we show that the ESN can effectively learn the system's dynamics even when trained with noisy numerical or experimental datasets. Thus, we demonstrate the potential of ESNs to serve as cheap surrogate models for simulating the dynamics of systems where complete observations are unavailable.
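The setup the abstract describes is a standard echo-state-network pipeline: a fixed random reservoir driven by the observed coordinate(s), a ridge-regression readout trained to reconstruct the full state one step ahead, and then autonomous closed-loop prediction. Below is a minimal sketch of that idea for a partially observed Lorenz system (only the x-coordinate fed in, full state predicted). The reservoir size, leak rate, spectral radius, regularization, and integration scheme are illustrative assumptions, not the paper's reported configuration.

```python
# Minimal ESN sketch for a partially observed Lorenz system.
# Hypothetical illustration: all hyperparameters are assumptions,
# not the configuration reported in the paper.
import numpy as np

rng = np.random.default_rng(0)

def lorenz_trajectory(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with forward Euler (coarse but adequate here)."""
    x = np.array([1.0, 1.0, 1.0])
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        x = x + dt * dx
        traj[i] = x
    return traj

# Partial observation: input is the x-coordinate only; the target is the
# full state (x, y, z) one step ahead.
data = lorenz_trajectory(6000)
u = data[:-1, :1]   # observed input, shape (T, 1)
y = data[1:, :]     # full-state target, shape (T, 3)

n_res, leak, rho_spec = 500, 0.3, 0.9
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= rho_spec / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius

# Drive the reservoir with the leaky-tanh update
# r_{t+1} = (1 - a) r_t + a * tanh(W r_t + W_in u_t).
r = np.zeros(n_res)
states = np.empty((len(u), n_res))
for t in range(len(u)):
    r = (1 - leak) * r + leak * np.tanh(W @ r + W_in @ u[t])
    states[t] = r

# Ridge-regression (Tikhonov) readout, discarding an initial washout.
washout, ridge = 200, 1e-6
S, Y = states[washout:], y[washout:]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ Y).T

# Autonomous prediction: feed the predicted x-coordinate back as input.
n_pred, preds = 1000, []
for _ in range(n_pred):
    out = W_out @ r
    preds.append(out)
    r = (1 - leak) * r + leak * np.tanh(W @ r + W_in @ out[:1])
preds = np.array(preds)
```

In this autonomous mode, the time at which `preds` first deviates from the true trajectory beyond some error threshold defines the prediction horizon; its dependence on the initial condition is the variability the paper studies.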
