Thermodynamic limit in learning period three (2405.08825v4)
Abstract: A continuous one-dimensional map with a period-three orbit has periodic orbits of all periods. This raises the following question: can we obtain periodic orbits of all periods solely by learning three data points? In this paper, we report that the answer is yes. Considering a random neural network in its thermodynamic limit, we first show that almost all learned periods are unstable and that each network has its own characteristic attractors (which can even be untrained ones). These latently acquired dynamics, unstable within the trained network, underlie the diversity of characteristic attractors and can even give rise to attractors of all periods after learning. When the neural network interpolation is quadratic, a universal post-learning bifurcation scenario appears, consistent with a topological conjugacy between the trained network and the classical logistic map. Beyond this universality, we explore properties specific to certain networks, including the singular behavior of the weight scale in the infinite limit, finite-size effects, and the symmetry in learning period three.
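As a minimal illustration of the setup the abstract describes (not the authors' implementation), the sketch below fits a random-feature network to three data points forming a period-3 orbit and then iterates the learned one-dimensional map. The specifics here are assumptions chosen for illustration: the tanh nonlinearity, the width `N = 2000` standing in for the thermodynamic limit, unit weight scale, and a minimum-norm least-squares readout. The point it demonstrates is the one made in the abstract: the three training points are reproduced exactly, yet the long-run dynamics from a generic initial condition converge to whatever attractor the network happens to have, which need not be the trained period-3 orbit.

```python
# Hypothetical sketch: learn a period-3 orbit with a random-feature network
# and inspect the emergent attractor of the learned 1D map.
import numpy as np

rng = np.random.default_rng(0)

# Three training points defining a period-3 orbit x1 -> x2 -> x3 -> x1.
orbit = np.array([0.2, 0.5, 0.9])
X = orbit                      # inputs
Y = np.roll(orbit, -1)         # targets: the next point on the orbit

# Random hidden layer phi(x) = tanh(w*x + b) with fixed random weights.
N = 2000                       # width (stand-in for the thermodynamic limit)
w = rng.normal(0.0, 1.0, N)    # input weights, unit scale assumed
b = rng.normal(0.0, 1.0, N)    # biases

def features(x):
    return np.tanh(np.outer(np.atleast_1d(x), w) + b)

# Train only the readout: minimum-norm least-squares interpolation.
a, *_ = np.linalg.lstsq(features(X), Y, rcond=None)

def f(x):
    """Learned one-dimensional map."""
    return features(x) @ a

# The three data points are interpolated (the period-3 orbit is learned) ...
print(f(orbit))                # ~ [0.5, 0.9, 0.2]

# ... but iterating from a generic point reveals the network's own
# characteristic attractor, which may differ from the trained orbit.
x = 0.31
for _ in range(1000):
    x = f(x).item()
print("long-run state:", x)
```

Rerunning with a different seed changes the random hidden layer and hence, in line with the abstract's claim, can change which attractor the trained network settles on.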