Learning theory for dynamical systems (2208.05349v3)

Published 10 Aug 2022 in math.DS and math.FA

Abstract: The task of modelling and forecasting a dynamical system is one of the oldest problems, and it remains challenging. Broadly, this task has two subtasks: extracting the full dynamical information from a partial observation, and then explicitly learning the dynamics from this information. We present a mathematical framework in which the dynamical information is represented in the form of an embedding. The framework combines the two subtasks using the language of spaces, maps, and commutations. It also unifies two of the most common learning paradigms: delay-coordinates and reservoir computing. We use this framework as a platform for two further investigations of the reconstructed system: its dynamical stability, and the growth of error under iterations. We show that these questions are deeply tied to more fundamental properties of the underlying system: the behavior of matrix cocycles over the base dynamics, its non-uniform hyperbolic behavior, and its decay of correlations. Thus, our framework bridges the gap between the universally observed behavior of dynamics modelling and the spectral, differential, and ergodic properties intrinsic to the dynamics.
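
The abstract names delay-coordinate embedding as one of the two learning paradigms the framework unifies. The sketch below is purely illustrative and not taken from the paper: it shows the standard construction of delay-coordinate vectors from a scalar observation series, with the delay tau and embedding dimension d as hypothetical parameters chosen for the example (Python with NumPy assumed).

```python
import numpy as np

def delay_embed(x, d=3, tau=1):
    """Stack d lagged copies of a scalar series x into rows
    [x_t, x_{t+tau}, ..., x_{t+(d-1)*tau}]; returns shape (N, d)."""
    x = np.asarray(x)
    n = len(x) - (d - 1) * tau
    if n <= 0:
        raise ValueError("series too short for the chosen d and tau")
    return np.column_stack([x[i * tau : i * tau + n] for i in range(d)])

# Example: only one coordinate of a (noisy) oscillation is observed.
t = np.linspace(0, 20, 2000)
obs = np.sin(t) + 0.01 * np.random.randn(t.size)
E = delay_embed(obs, d=3, tau=25)
print(E.shape)  # (1950, 3)
```

A reservoir-computing approach would instead pass the same observations through a fixed recurrent reservoir and train only a linear readout; in either case the result is a reconstructed state space on which the dynamics is learned and iterated, which is where the stability and error-growth questions studied in the paper arise.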
