Neural Networks as Geometric Chaotic Maps (1912.05081v4)

Published 11 Dec 2019 in cs.LG, math.DS, nlin.CD, and stat.ML

Abstract: The use of artificial neural networks as models of chaotic dynamics has been rapidly expanding. Still, a theoretical understanding of how neural networks learn chaos is lacking. Here, we employ a geometric perspective to show that neural networks can efficiently model chaotic dynamics by becoming structurally chaotic themselves. We first confirm neural networks' efficiency in emulating chaos by showing that a parsimonious neural network trained on only a few data points can reconstruct strange attractors, extrapolate outside the boundaries of the training data, and accurately predict local divergence rates. We then posit that the trained network's map comprises sequential geometric stretching, rotation, and compression operations. These geometric operations indicate topological mixing and chaos, explaining why neural networks are naturally suited to emulating chaotic dynamics.
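
To make the first claim concrete, here is a minimal sketch (not the authors' code) of the kind of experiment the abstract describes: a small neural network is trained on a short trajectory of the Lorenz system to learn its one-step map, then iterated autonomously to reconstruct the strange attractor. The use of scikit-learn's MLPRegressor, the network size, and all hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Sketch: train a parsimonious MLP to emulate the Lorenz one-step map,
# then iterate the learned map freely to reconstruct the attractor.
# All hyperparameters below are assumptions for illustration only.
import numpy as np
from sklearn.neural_network import MLPRegressor

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One RK4 integration step of the Lorenz equations."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Generate a short training trajectory (the "few data points" regime).
n_train = 2000
traj = np.empty((n_train + 1, 3))
traj[0] = np.array([1.0, 1.0, 1.0])
for i in range(n_train):
    traj[i + 1] = lorenz_step(traj[i])

# Learn the one-step map x_t -> x_{t+1}.
X, Y = traj[:-1], traj[1:]

# A parsimonious network: a single small hidden layer.
net = MLPRegressor(hidden_layer_sizes=(32,), activation="tanh",
                   max_iter=5000, tol=1e-7, random_state=0)
net.fit(X, Y)

# Iterate the trained map autonomously; a good emulator traces out the
# strange attractor rather than collapsing to a fixed point.
state = traj[-1].copy()
free_run = [state]
for _ in range(5000):
    state = net.predict(state.reshape(1, -1))[0]
    free_run.append(state)
free_run = np.asarray(free_run)
print("free-run trajectory shape:", free_run.shape)
```

The geometric argument in the second half of the abstract can be read through the singular value decomposition: any weight matrix factors as rotation, axis-wise stretching/compression, rotation, so a trained layer's action is a sequence of exactly the operations (stretch, rotate, compress) that, together with the nonlinearity, produce folding and topological mixing.
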

Citations (7)
