Chaotic attractor reconstruction using small reservoirs -- the influence of topology (2402.16888v1)
Abstract: Forecasting time series from measured data is needed in a wide range of applications and has been the subject of extensive research. A particularly challenging task is forecasting time series generated by chaotic dynamics. In recent years, reservoir computing has been shown to be an effective method for forecasting chaotic dynamics and reconstructing chaotic attractors from data. In this work, strides are made toward smaller, lower-complexity reservoirs with the goal of improved hardware implementability and more reliable production of adequate surrogate models. We show that a reservoir of uncoupled nodes more reliably produces long-term time series predictions than complex reservoir topologies. We then link the improved attractor reconstruction of the uncoupled reservoir to the smaller spectral radii of the resulting surrogate systems. These results indicate that the node degree plays an important role in determining whether the desired dynamics will be stable in the autonomous surrogate system, which is obtained via closed-loop operation of the trained reservoir. In terms of hardware implementability, uncoupled nodes allow greater freedom in the hardware architecture because no complex coupling setups are needed and because, for uncoupled nodes, the system response is equivalent for space and time multiplexing.
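To make the closed-loop surrogate construction concrete, here is a minimal sketch, not the authors' implementation: an echo state network whose internal weight matrix is diagonal (uncoupled nodes, node degree one), driven open-loop by Lorenz-63 data, trained with a ridge-regression readout, and then iterated closed-loop by feeding its own output back as input. The reservoir size, weight ranges, ridge strength, Euler integration, and the spectral-radius proxy at the end are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# --- Lorenz-63 training data via simple Euler integration ---
def lorenz_traj(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x = np.array([1.0, 1.0, 1.0])
    traj = np.empty((n_steps, 3))
    for t in range(n_steps):
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        x = x + dt * dx
        traj[t] = x
    return traj

data = lorenz_traj(6000)
u_train, y_train = data[:-1], data[1:]   # one-step-ahead targets

# --- reservoir of uncoupled nodes: a diagonal internal weight matrix ---
N = 50
A = np.diag(rng.uniform(0.2, 0.9, N))    # node degree 1: no inter-node coupling
W_in = rng.uniform(-0.5, 0.5, (N, 3))

def run_open_loop(inputs):
    """Drive the reservoir with the measured time series (open loop)."""
    r = np.zeros(N)
    states = np.empty((len(inputs), N))
    for t, u in enumerate(inputs):
        r = np.tanh(A @ r + W_in @ u)
        states[t] = r
    return states

R = run_open_loop(u_train)

# --- linear readout trained by ridge regression ---
ridge = 1e-6                              # illustrative regularization strength
W_out = np.linalg.solve(R.T @ R + ridge * np.eye(N), R.T @ y_train).T

# --- closed loop: feed the prediction back as input (autonomous surrogate) ---
r, u = R[-1], y_train[-1]
prediction = np.empty((2000, 3))
for t in range(2000):
    r = np.tanh(A @ r + W_in @ u)
    u = W_out @ r
    prediction[t] = u

# Crude stability proxy: spectral radius of the closed-loop linearization
# at the origin, where tanh'(0) = 1, i.e. of A + W_in @ W_out.
rho_surrogate = np.max(np.abs(np.linalg.eigvals(A + W_in @ W_out)))
print(f"surrogate spectral radius (origin linearization): {rho_surrogate:.3f}")
```

With a diagonal A, each node is an independently driven one-dimensional filter, which is why, as the abstract notes, the response of such a reservoir is the same whether the nodes are realized in parallel hardware (space multiplexing) or as successive samples of a single physical node (time multiplexing).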