State-space systems as dynamic generative models (2404.08717v3)
Abstract: A probabilistic framework is introduced to study the dependence structure that deterministic discrete-time state-space systems induce between input and output processes. General sufficient conditions are formulated under which output processes exist and are unique once an input process has been fixed, a property known in the deterministic state-space literature as the echo state property. When these conditions are satisfied, the given state-space system becomes a generative model for probabilistic dependences between two sequence spaces. Moreover, these conditions guarantee that the output depends continuously on the input with respect to the Wasserstein metric. The output processes whose existence is proved are shown to be causal in a specific sense and to generalize those studied in purely deterministic settings. The results in this paper constitute a significant stochastic generalization of the sufficient conditions for the deterministic echo state property: the stochastic echo state property can be satisfied under contractivity conditions that are strictly weaker than those required in the deterministic case. This means that state-space systems can induce a purely probabilistic dependence structure between input and output sequence spaces even when there is no functional relation between them.
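To make the (deterministic) echo state property concrete, the following sketch is a standard numerical illustration, not code from the paper: a state map x_{t+1} = tanh(A x_t + C u_t) that is contractive in the state (the spectral norm of A is scaled below 1, and tanh is 1-Lipschitz). Under such a contraction, state sequences started from different initial conditions but driven by the same input converge to each other, so the state (and hence the output) is asymptotically a function of the input alone. The matrices A and C and all dimensions here are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, T = 50, 3, 200                      # state dim, input dim, horizon

A = rng.standard_normal((n, n))
A *= 0.9 / np.linalg.norm(A, 2)           # enforce ||A||_2 = 0.9 < 1 (contraction)
C = rng.standard_normal((n, m))
u = rng.standard_normal((T, m))           # one fixed input realization

def run(x0):
    """Iterate the state map x_{t+1} = tanh(A x_t + C u_t) for T steps."""
    x = x0
    for t in range(T):
        x = np.tanh(A @ x + C @ u[t])
    return x

# Two different initial states, same input: the gap contracts geometrically.
x_a = run(rng.standard_normal(n))
x_b = run(rng.standard_normal(n))
gap = np.linalg.norm(x_a - x_b)
print(gap)                                # essentially 0: initial condition forgotten
```

The paper's point is that in the stochastic setting, existence and uniqueness of the output *process* (and continuity in the Wasserstein metric) can hold under contractivity conditions strictly weaker than the pathwise one enforced above.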