
Stable Reduced-Rank VAR Identification (2403.00237v4)

Published 1 Mar 2024 in stat.ME, cs.SY, and eess.SY

Abstract: The vector autoregression (VAR) is widely used in system identification, econometrics, the natural sciences, and many other areas. However, when the state dimension becomes large, the parameter dimension explodes, so reduced-rank modelling is attractive and well developed. A fundamental requirement in almost all applications is stability of the fitted model, yet stability has not been addressed in the reduced-rank case. Here we develop, for the first time, a closed-form formula for an estimator of a reduced-rank transition matrix that is guaranteed to be stable. We show that our estimator is consistent and asymptotically statistically efficient, and we illustrate it in comparative simulations.
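To make the problem concrete, the sketch below fits a reduced-rank VAR(1) transition matrix and enforces stability. It is a generic baseline (least squares, then truncated SVD, then eigenvalue rescaling), not the paper's closed-form estimator; the function name, rank parameter `r`, and stability margin are illustrative assumptions.

```python
import numpy as np

def fit_stable_reduced_rank_var(X, r, margin=0.99):
    """Fit a rank-r VAR(1) transition matrix A (x_t = A x_{t-1} + e_t)
    and crudely enforce stability (spectral radius < 1).

    This is a generic sketch for illustration only, NOT the paper's
    closed-form stable estimator.

    X      : (T, n) array of observations, one row per time step.
    r      : target rank of the transition matrix.
    margin : cap on the spectral radius of the returned matrix.
    """
    Y, Z = X[1:], X[:-1]
    # Ordinary least-squares VAR(1): Y ≈ Z @ A.T, so solve for A.T.
    At, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    A_ls = At.T
    # Rank reduction: keep the r largest singular values of A_ls.
    U, s, Vt = np.linalg.svd(A_ls)
    A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r]
    # Stability enforcement by scaling (preserves rank but is ad hoc;
    # the paper instead gives a guaranteed-stable closed-form estimator).
    rho = max(abs(np.linalg.eigvals(A_r)))
    if rho >= margin:
        A_r *= margin / rho
    return A_r
```

The scaling step illustrates why stability is nontrivial here: naive post-hoc fixes like this distort the estimate, whereas the paper's contribution is an estimator that is stable by construction while remaining consistent and asymptotically efficient.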
