
Efficient Batch and Recursive Least Squares for Matrix Parameter Estimation (2404.10911v2)

Published 16 Apr 2024 in eess.SP, cs.SY, and eess.SY

Abstract: Traditionally, batch least squares (BLS) and recursive least squares (RLS) are used to identify a vector of parameters that forms a linear model. In some situations, however, it is of interest to identify parameters in a matrix structure. In this case, a common approach is to transform the problem into standard vector form using the vectorization (vec) operator and the Kronecker product, a technique known as vec-permutation. However, the Kronecker product introduces extraneous zero terms into the regressor, incurring unnecessary additional computational and space requirements. This work derives matrix BLS and RLS formulations which, under mild assumptions, minimize the same cost as the vec-permutation approach. The new approach has lower computational and space complexity than vec-permutation in both BLS and RLS identification. It is also shown that persistent excitation guarantees convergence to the true matrix parameters. This method can be used to reduce computation time in the online identification of multiple-input, multiple-output systems for indirect adaptive model predictive control.
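The contrast the abstract draws can be sketched numerically. The snippet below is an illustrative reconstruction, not the paper's implementation: for a model y_k = A x_k with matrix parameter A, it estimates A three ways — via the vec-permutation regressor (x_k^T ⊗ I_n), via a direct matrix BLS normal-equation solve that avoids Kronecker products entirely, and via a standard matrix RLS rank-one update. All variable names are assumptions for illustration; the paper's exact algorithms may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, N = 3, 4, 50                      # output dim, input dim, sample count
A_true = rng.standard_normal((n, m))    # matrix parameter to identify

X = rng.standard_normal((m, N))         # regressors x_k as columns
Y = A_true @ X                          # noiseless outputs y_k = A x_k

# --- Vec-permutation: y_k = (x_k^T kron I_n) vec(A) ---
# The Kronecker regressor is (n x n*m) per sample and mostly zeros.
Phi = np.vstack([np.kron(X[:, k], np.eye(n)) for k in range(N)])
y_stacked = Y.T.reshape(-1)             # [y_1; y_2; ...; y_N]
theta = np.linalg.lstsq(Phi, y_stacked, rcond=None)[0]
A_vec = theta.reshape((n, m), order="F")  # un-vec (column-major)

# --- Direct matrix BLS: A_hat = Y X^T (X X^T)^{-1}, no Kronecker products ---
A_mat = Y @ X.T @ np.linalg.inv(X @ X.T)

# --- Matrix RLS (standard rank-one update; each row of A shares the gain) ---
A_rls = np.zeros((n, m))
P = 1e6 * np.eye(m)                     # large initial "covariance"
for k in range(N):
    x = X[:, [k]]                       # (m, 1) regressor
    y = Y[:, [k]]                       # (n, 1) output
    Px = P @ x
    g = Px / (1.0 + x.T @ Px)           # (m, 1) gain
    A_rls = A_rls + (y - A_rls @ x) @ g.T
    P = P - g @ Px.T                    # Sherman-Morrison covariance update
```

On noiseless data all three estimates agree with A_true; the point of the matrix formulations is that they reach the same minimizer while operating on m×m quantities instead of the nm×nm quantities the Kronecker regressor implies.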

