Using a reservoir computer to learn chaotic attractors, with applications to chaos synchronisation and cryptography (1802.02844v2)

Published 8 Feb 2018 in cs.NE

Abstract: Using the machine learning approach known as reservoir computing, it is possible to train one dynamical system to emulate another. We show that such trained reservoir computers reproduce the properties of the attractor of the chaotic system sufficiently well to exhibit chaos synchronisation. That is, the trained reservoir computer, weakly driven by the chaotic system, will synchronise with the chaotic system. Conversely, the chaotic system, weakly driven by a trained reservoir computer, will synchronise with the reservoir computer. We illustrate this behaviour on the Mackey-Glass and Lorenz systems. We then show that trained reservoir computers can be used to crack chaos-based cryptography and illustrate this on a chaos cryptosystem based on the Mackey-Glass system. We conclude by discussing why reservoir computers are so good at emulating chaotic systems.
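To make the abstract's idea of "training one dynamical system to emulate another" concrete, below is a minimal echo state network sketch in Python that is driven by Lorenz-system data, trained with a ridge-regression readout to predict the next state, and then run autonomously in closed loop. The reservoir size, spectral radius, time step, and regularisation value are illustrative assumptions and are not taken from the paper.

```python
# Minimal echo state network (reservoir computer) sketch: drive the reservoir
# with Lorenz data, fit a linear readout, then run it autonomously so it
# emulates the attractor. Hyperparameters below are assumptions, not the
# paper's values.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

# --- Generate Lorenz training data (standard parameters) ---
def lorenz(t, u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = u
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

dt = 0.02
t_eval = np.arange(0, 60, dt)
sol = solve_ivp(lorenz, (0, 60), [1.0, 1.0, 1.0], t_eval=t_eval)
data = sol.y.T                        # shape (T, 3)

# --- Reservoir setup (assumed sizes/scalings) ---
N = 500                               # reservoir size
W_in = rng.uniform(-0.5, 0.5, (N, 3))
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9

def step(r, u):
    """One reservoir update driven by input u."""
    return np.tanh(W @ r + W_in @ u)

# --- Drive the reservoir and collect (state, next-input) pairs ---
washout = 200                         # discard initial transient
r = np.zeros(N)
states, targets = [], []
for k in range(len(data) - 1):
    r = step(r, data[k])
    if k >= washout:
        states.append(r)
        targets.append(data[k + 1])
R, Y = np.array(states), np.array(targets)

# --- Ridge regression for the linear readout W_out ---
ridge = 1e-6
W_out = np.linalg.solve(R.T @ R + ridge * np.eye(N), R.T @ Y).T

# --- Autonomous run: feed the readout back in place of the Lorenz input ---
u = data[-1].copy()
trajectory = []
for _ in range(1000):
    r = step(r, u)
    u = W_out @ r
    trajectory.append(u)
trajectory = np.array(trajectory)     # should trace out a Lorenz-like attractor
print(trajectory[:5])
```

In the closed-loop phase the reservoir no longer sees the true Lorenz signal at all; if training succeeded, its output stays on an attractor with the same geometry, which is the property the paper exploits for synchronisation and for attacking chaos-based cryptosystems.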

Citations (72)
