Universality of reservoir systems with recurrent neural networks (2403.01900v2)

Published 4 Mar 2024 in cs.NE and cs.LG

Abstract: We discuss the approximation capability of reservoir systems whose reservoir is a recurrent neural network (RNN). We show what we call uniform strong universality of RNN reservoir systems for a certain class of dynamical systems: given an approximation error to be achieved, one can construct a single RNN reservoir system that approximates each target dynamical system in the class simply by adjusting its linear readout. To show this universality, we construct an RNN reservoir system via parallel concatenation whose upper bound on the approximation error is independent of the individual target in the class.
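
For intuition, here is a minimal Python sketch of the general setting the abstract describes, assuming an echo-state-style tanh RNN reservoir: several fixed random reservoirs are driven in parallel by the same input, their states are concatenated, and only a linear readout is fitted (here by ridge regression). All sizes, the toy target system, and the ridge-regression fit are illustrative assumptions, not the paper's actual construction or error bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_state, n_in, spectral_radius=0.9):
    """Random RNN reservoir: fixed recurrent matrix A, input matrix C, bias b."""
    A = rng.normal(size=(n_state, n_state))
    A *= spectral_radius / max(abs(np.linalg.eigvals(A)))  # common contractivity heuristic
    C = rng.normal(size=(n_state, n_in))
    b = rng.normal(size=n_state)
    return A, C, b

def run_reservoir(params, inputs):
    """Drive the reservoir with an input sequence; the dynamics stay untrained."""
    A, C, b = params
    x = np.zeros(A.shape[0])
    states = []
    for u in inputs:
        x = np.tanh(A @ x + C @ u + b)
        states.append(x)
    return np.array(states)

def concatenated_states(reservoirs, inputs):
    """Parallel concatenation: run independent reservoirs on the same input
    and stack their state trajectories side by side."""
    return np.hstack([run_reservoir(p, inputs) for p in reservoirs])

def fit_readout(states, targets, ridge=1e-6):
    """Ridge regression for the linear readout W, the only trained component."""
    S = states
    return np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ targets)

# Toy target: a scalar dynamical system driven by the input sequence (illustrative).
T = 500
u = rng.uniform(-1, 1, size=(T, 1))
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + 0.4 * np.tanh(u[t - 1, 0])

reservoirs = [make_reservoir(50, 1) for _ in range(4)]
S = concatenated_states(reservoirs, u)
# Discard a short washout so initial-state transients do not bias the fit.
W = fit_readout(S[50:400], y[50:400])
print("test MSE:", np.mean((S[400:] @ W - y[400:]) ** 2))
```

The point mirrored from the abstract is that the reservoir bank is fixed once and for all: approximating a different target system in the class means refitting only the readout W, and parallel concatenation enlarges the state space so that one fixed bank can serve every target in the class.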

