
Outlier-robust Kalman Filtering through Generalised Bayes (2405.05646v2)

Published 9 May 2024 in stat.ML, cs.LG, cs.SY, and eess.SY

Abstract: We derive a novel, provably robust, and closed-form Bayesian update rule for online filtering in state-space models in the presence of outliers and misspecified measurement models. Our method combines generalised Bayesian inference with filtering methods such as the extended and ensemble Kalman filter. We use the former to show robustness and the latter to ensure computational efficiency in the case of nonlinear models. Our method matches or outperforms other robust filtering methods (such as those based on variational Bayes) at a much lower computational cost. We show this empirically on a range of filtering problems with outlier measurements, such as object tracking, state estimation in high-dimensional chaotic systems, and online learning of neural networks.


Summary

  • The paper introduces WoLF, a new method that integrates generalized Bayes into Kalman filtering to mitigate outlier effects and model inaccuracies.
  • The approach maintains closed-form updates and computational efficiency while being applicable to both linear and non-linear state-space models.
  • The method's flexibility in selecting weighting functions enables robust performance across diverse real-time applications such as navigation and forecasting.

Enhancing Robustness of Kalman Filters with Generalised Bayes

Introduction

Kalman Filters (KFs) are staple tools in signal processing, object tracking, and other areas dealing with dynamic systems and time-series data. However, they can struggle with outliers and model misspecification because they assume Gaussian measurement noise, an assumption that often fails in real-world applications. The paper introduces a novel variant of the Kalman Filter that incorporates ideas from generalised Bayesian inference to achieve robustness against outliers and model misspecification in both linear and non-linear state-space models.

What's New: The WoLF Approach

The proposed approach, dubbed the Weighted Observation Likelihood Filter (WoLF), modifies the standard Kalman Filter update by introducing a weight on the observation likelihood, dampening the influence of measurements flagged as potential outliers by a specified weighting function (a code sketch of this weighted update follows the list below). Unlike many other robust filtering methods, WoLF retains the computational efficiency of the standard Kalman Filter by preserving closed-form update equations.

  • Integration with Existing Filters: WoLF can be combined with the standard Kalman Filter as well as the extended Kalman Filter (EKF) and the ensemble Kalman Filter (EnKF), showing its versatility across Kalman Filtering techniques.
  • Weighting Functions: The flexibility of the approach also extends to the choice of the weighting function, which can be tailored to specific applications or outlier characteristics, such as using inverse multi-quadratic or Mahalanobis-based functions.
  • Computational Efficiency: Computing the weight adds negligible overhead, so the method retains a per-step complexity comparable to that of the underlying filter.
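
To make the weighted update concrete, below is a minimal NumPy sketch of a weighted-likelihood Kalman step. It is an illustrative reconstruction, not the authors' reference implementation: it assumes a scalar weight computed by an inverse multi-quadratic (IMQ) function with a hypothetical soft threshold c, and that the weight enters the closed-form update by inflating the measurement covariance to R / w².

```python
import numpy as np

def imq_weight(y, y_pred, R, c=4.0):
    """Inverse multi-quadratic weight: close to 1 for typical residuals,
    decaying towards 0 as the R-standardised residual grows.
    The threshold c is a hypothetical tuning constant."""
    r = y - y_pred
    m2 = r @ np.linalg.solve(R, r)            # squared Mahalanobis distance
    return (1.0 + m2 / c**2) ** -0.5

def wolf_kf_step(mu, Sigma, y, F, Q, H, R, weight_fn=imq_weight):
    """One predict/update step of a weighted-likelihood (WoLF-style) Kalman filter."""
    # Predict step (standard linear-Gaussian dynamics)
    mu_pred = F @ mu
    Sigma_pred = F @ Sigma @ F.T + Q
    # Weight the incoming observation
    y_pred = H @ mu_pred
    w = weight_fn(y, y_pred, R)
    if w < 1e-8:                              # observation effectively discarded
        return mu_pred, Sigma_pred, w
    R_eff = R / w**2                          # down-weighting inflates the noise covariance
    # Standard closed-form Kalman update with the inflated covariance
    S = H @ Sigma_pred @ H.T + R_eff
    K = Sigma_pred @ H.T @ np.linalg.inv(S)
    mu_new = mu_pred + K @ (y - y_pred)
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma_pred
    return mu_new, Sigma_new, w
```

When w = 1 this reduces to the standard Kalman update; as w approaches 0 the observation is effectively ignored, which is where the outlier robustness comes from.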

Key Advantages

WoLF presents several compelling advantages over existing methods:

  1. Closed-form Updates: Keeping the updates in closed form keeps computational overhead low and implementation straightforward.
  2. Robustness to Outliers: By dampening the influence of outliers through weighting, WoLF delivers more reliable estimates when observations are atypical or erroneous (see the toy example after this list).
  3. Theoretical Soundness: The paper formally proves the robustness of the weighted update, so the method rests on statistical theory rather than heuristics alone.
  4. Flexibility: Easy adaptation to different types of Kalman Filters (KF, EKF, EnKF) and different systems (linear, non-linear) showcases the method's versatility.
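
To illustrate the robustness advantage in code, here is a toy usage of the sketch above on a one-dimensional random walk with a single injected outlier (an illustrative setup, not an experiment from the paper):

```python
import numpy as np

# Continues from the wolf_kf_step / imq_weight sketch above.
rng = np.random.default_rng(0)
F, Q, H = np.eye(1), np.eye(1), np.eye(1)
R = 0.5 * np.eye(1)
mu, Sigma = np.zeros(1), np.eye(1)

x = 0.0
for t in range(50):
    x += rng.normal()                                  # latent random-walk state
    y = np.array([x]) + rng.normal(scale=np.sqrt(0.5), size=1)
    if t == 25:
        y += 50.0                                      # inject one gross outlier
    mu, Sigma, w = wolf_kf_step(mu, Sigma, y, F, Q, H, R)
    # At t == 25 the IMQ weight w drops sharply, so the corrupted
    # observation barely moves the posterior mean, whereas a standard
    # Kalman filter (w fixed at 1) would track the spike.
```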

Practical Implications and Speculations

WoLF's ability to handle outliers effectively without a significant computational penalty can make it a favorable choice in real-time systems where speed and accuracy are critical, such as in autonomous vehicle navigation, real-time economic forecasting, or adaptive control systems in engineering. Its adaptability to both linear and non-linear systems, as well as its integration with ensemble methods, also positions it well for complex dynamic modeling tasks such as weather forecasting or large-scale environmental modeling.

Future Directions

While the current approach improves robustness against certain types of data issues, several potential areas could be further explored to enhance its capabilities:

  • Handling Non-Gaussian State Processes: Extending robustness to non-Gaussian state-transition noise could broaden the method's range of applications.
  • Adaptive Weighting Functions: Developing methods to dynamically adjust weighting functions based on real-time data characteristics could improve filter performance and robustness.
  • Integration with Learning Models: Exploring deeper integration with machine learning models, where parameters of the dynamic models are learned from data, could open new avenues in adaptive filtering.

In conclusion, the Weighted Observation Likelihood Filter (WoLF) offers a robust, flexible, and computationally efficient method for enhancing the performance of Kalman Filters in outlier-prone environments, promising substantial benefits across various dynamic data modeling fields.