Attractor-merging Crises and Intermittency in Reservoir Computing (2504.12695v1)
Published 17 Apr 2025 in nlin.CD, cs.LG, cs.NE, and math.DS
Abstract: Reservoir computing can embed attractors into random neural networks (RNNs), generating a "mirror" of a target attractor because of its inherent symmetry constraints. In these RNNs, we report that an attractor-merging crisis accompanied by intermittency emerges simply by adjusting a global parameter. We further reveal the underlying mechanism through a detailed analysis of the phase-space structure and demonstrate that this bifurcation scenario is intrinsic to a general class of RNNs, independent of the training data.
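To illustrate the setting the abstract describes, the following is a minimal sketch (not the authors' exact setup; reservoir size, spectral radius, and other parameter values are assumptions) of an echo state network trained to embed the Lorenz attractor. Because the tanh activation is odd and the readout is linear, the closed-loop dynamics are equivariant under a sign flip of the reservoir state, so a mirror-image copy of the embedded attractor coexists with it.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- generate Lorenz training data (standard parameters) ---
def lorenz_step(u, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = u
    return u + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

T = 20000
data = np.empty((T, 3))
u = np.array([1.0, 1.0, 1.0])
for t in range(T):
    u = lorenz_step(u)
    data[t] = u
data /= data.std(axis=0)  # simple normalization

# --- random reservoir (RNN) with a global scaling parameter ---
N = 300                    # reservoir size (assumed value)
spectral_radius = 0.9      # global parameter scaling the recurrent weights (assumed value)
W = rng.normal(size=(N, N))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(N, 3))

# --- drive the reservoir with the data and fit a linear readout by ridge regression ---
r = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    r = np.tanh(W @ r + W_in @ data[t])
    states[t] = r
washout = 500
X, Y = states[washout:-1], data[washout + 1:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ Y).T

# --- closed-loop (autonomous) runs: the embedded attractor and its "mirror" ---
def run_autonomous(r0, steps=5000):
    r, out = r0.copy(), []
    for _ in range(steps):
        # feeding the readout back as input makes the map odd in r,
        # so r -> -r maps one attractor onto its mirror image
        r = np.tanh(W @ r + W_in @ (W_out @ r))
        out.append(W_out @ r)
    return np.array(out)

traj = run_autonomous(r)      # trajectory on the embedded Lorenz-like attractor
mirror = run_autonomous(-r)   # sign-flipped initial state lands on the mirror attractor
print(traj[-1], mirror[-1])
```

In such a sketch, sweeping the global scaling parameter (here `spectral_radius`) is the kind of adjustment under which the paper reports the embedded attractor and its mirror merging in a crisis accompanied by intermittent switching between the two.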