
Comparison of Reservoir Computing topologies using the Recurrent Kernel approach (2401.14557v3)

Published 25 Jan 2024 in cs.LG and cs.NE

Abstract: Reservoir Computing (RC) has become popular in recent years thanks to its fast and efficient computational capabilities. Standard RC has been shown to be equivalent in the asymptotic limit to Recurrent Kernels, which helps in analyzing its expressive power. However, many well-established RC paradigms, such as Leaky RC, Sparse RC, and Deep RC, are yet to be systematically analyzed in such a way. We define the Recurrent Kernel limit of all these RC topologies and conduct a convergence study for a wide range of activation functions and hyperparameters. Our findings provide new insights into various aspects of Reservoir Computing. First, we demonstrate that there is an optimal sparsity level which grows with the reservoir size. Furthermore, our analysis suggests that Deep RC should use reservoir layers of decreasing sizes. Finally, we perform a benchmark demonstrating the efficiency of Structured Reservoir Computing compared to vanilla and Sparse Reservoir Computing.
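
As a concrete illustration of the Recurrent Kernel limit the abstract refers to, here is a minimal NumPy sketch (not the authors' code; the function name, weight scalings, and hyperparameters are illustrative assumptions). It drives a vanilla tanh reservoir with two fixed input sequences and checks that the normalized inner product of the final reservoir states concentrates around a deterministic value as the reservoir size N grows.

```python
# Minimal sketch of the Recurrent Kernel limit for a vanilla tanh reservoir.
# Illustrative only: weight scalings and hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
T, d = 20, 3                       # time steps, input dimension
u_a = rng.standard_normal((T, d))  # two input sequences to compare
u_b = rng.standard_normal((T, d))

def empirical_kernel(N, sigma_res=0.5, sigma_in=1.0):
    """Run one random reservoir of size N on both inputs and return
    the normalized final-state inner product <x_a, x_b> / N."""
    W_res = rng.standard_normal((N, N)) * sigma_res / np.sqrt(N)
    W_in = rng.standard_normal((N, d)) * sigma_in / np.sqrt(d)
    x_a = np.zeros(N)
    x_b = np.zeros(N)
    for t in range(T):
        x_a = np.tanh(W_res @ x_a + W_in @ u_a[t])
        x_b = np.tanh(W_res @ x_b + W_in @ u_b[t])
    return x_a @ x_b / N

for N in (100, 1000, 5000):
    vals = [empirical_kernel(N) for _ in range(5)]
    print(f"N={N:5d}  kernel mean={np.mean(vals):+.4f}  std={np.std(vals):.4f}")
```

The shrinking standard deviation across independent weight draws as N grows is the empirical signature of convergence to a deterministic recurrent kernel; the paper carries out this kind of convergence study systematically for Leaky, Sparse, and Deep RC topologies.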

