The Rank-Reduced Kalman Filter: Approximate Dynamical-Low-Rank Filtering In High Dimensions (2306.07774v3)

Published 13 Jun 2023 in stat.ML and cs.LG

Abstract: Inference and simulation in the context of high-dimensional dynamical systems remain computationally challenging problems. Some form of dimensionality reduction is required to make the problem tractable in general. In this paper, we propose a novel approximate Gaussian filtering and smoothing method which propagates low-rank approximations of the covariance matrices. This is accomplished by projecting the Lyapunov equations associated with the prediction step to a manifold of low-rank matrices, which are then solved by a recently developed, numerically stable, dynamical low-rank integrator. Meanwhile, the update steps are made tractable by noting that the covariance update only transforms the column space of the covariance matrix, which is low-rank by construction. The algorithm differentiates itself from existing ensemble-based approaches in that the low-rank approximations of the covariance matrices are deterministic, rather than stochastic. Crucially, this enables the method to reproduce the exact Kalman filter as the low-rank dimension approaches the true dimensionality of the problem. Our method reduces computational complexity from cubic (for the Kalman filter) to \emph{quadratic} in the state-space size in the worst-case, and can achieve \emph{linear} complexity if the state-space model satisfies certain criteria. Through a set of experiments in classical data-assimilation and spatio-temporal regression, we show that the proposed method consistently outperforms the ensemble-based methods in terms of error in the mean and covariance with respect to the exact Kalman filter. This comes at no additional cost in terms of asymptotic computational complexity.
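
To make the two steps concrete, here is a minimal sketch of how a low-rank factored covariance passes through the filter, assuming a generic factorization $P \approx U S U^\top$ with $U \in \mathbb{R}^{n \times r}$ (orthonormal columns) and $S \in \mathbb{R}^{r \times r}$; the symbols $A$, $Q$, $H$, $R$, $U$, $S$, and $r$ are illustrative notation rather than the paper's own. The prediction step propagates the covariance through a differential Lyapunov equation of the form
\[
\dot{P}(t) = A\,P(t) + P(t)\,A^\top + Q,
\]
which the paper handles by projecting onto the manifold of rank-$r$ matrices and integrating with a dynamical low-rank integrator. For the update step, applying the standard Kalman formulas to a factored prior $P^- = U S U^\top$ gives
\[
\begin{aligned}
K &= P^- H^\top \big(H P^- H^\top + R\big)^{-1} = U S (HU)^\top \big((HU) S (HU)^\top + R\big)^{-1},\\
P^+ &= (I - K H)\,P^- = U \Big[S - S (HU)^\top \big((HU) S (HU)^\top + R\big)^{-1} (HU) S\Big] U^\top,
\end{aligned}
\]
so the posterior column space stays within $\operatorname{span}(U)$ and only the small $r \times r$ core matrix is updated; the full $n \times n$ covariance never needs to be formed explicitly, which is what keeps the per-step cost sub-cubic in the state dimension.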
