Embedding and Approximation Theorems for Echo State Networks

Published 14 Aug 2019 in nlin.CD and math.DS (arXiv:1908.05202v2)

Abstract: Echo State Networks (ESNs) are a class of single-layer recurrent neural networks that have enjoyed recent attention. In this paper we prove that a suitable ESN, trained on a series of measurements of an invertible dynamical system, induces a C^1 map from the dynamical system's phase space to the ESN's reservoir space. We call this the Echo State Map. We then prove that the Echo State Map is generically an embedding with positive probability. Under additional mild assumptions, we further conjecture that the Echo State Map is almost surely an embedding. For sufficiently large, and specially structured, but still randomly generated ESNs, we prove that there exists a linear readout layer that allows the ESN to predict the next observation of a dynamical system arbitrarily well. Consequently, if the dynamical system under observation is structurally stable, then the trained ESN will exhibit dynamics that are topologically conjugate to the future behaviour of the observed dynamical system. Our theoretical results connect the theory of ESNs to the delay-embedding literature for dynamical systems, and are supported by numerical evidence from simulations of the traditional Lorenz equations. The simulations confirm that, from a one-dimensional observation function, an ESN can accurately infer a range of geometric and topological features of the dynamics, such as the eigenvalues of equilibrium points, Lyapunov exponents, and homology groups.
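
To make the setting concrete, the following is a minimal sketch (not the authors' exact construction) of the pipeline the abstract describes: a randomly generated reservoir driven by a one-dimensional observation of the Lorenz system, with a linear readout fitted by ridge regression to predict the next observation. The reservoir size, spectral-radius scaling, ridge parameter, and the choice of the x-coordinate as the observation function are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- One-dimensional observation series from the Lorenz system ---
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

state = np.array([1.0, 1.0, 1.0])
obs = []
for _ in range(6000):
    state = lorenz_step(state)
    obs.append(state[0])          # observe only the x-coordinate
obs = np.array(obs[1000:])        # discard the transient

# --- Randomly generated reservoir (illustrative sizes/scales) ---
N = 300                                            # reservoir dimension
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius < 1
w_in = rng.uniform(-0.5, 0.5, size=N)

def run_reservoir(u):
    """Drive the reservoir with the input series u; return the state trajectory."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + w_in * u_t)            # reservoir state update
        states[t] = x
    return states

states = run_reservoir(obs[:-1])

# --- Linear readout: ridge regression onto the next observation ---
washout = 200                     # drop initial states before fitting
X = states[washout:]
y = obs[washout + 1:]
ridge = 1e-6
w_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)

pred = X @ w_out
print("one-step prediction RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

In the paper's terminology, the map from the underlying Lorenz state to the driven reservoir state plays the role of the Echo State Map; the sketch above only illustrates the training loop and linear readout, not the embedding and conjugacy results themselves.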
