Uncertainty-Aware Bayes' Rule and Its Applications (2311.05532v3)
Abstract: Bayes' rule has enabled innumerable powerful algorithms in statistical signal processing and statistical machine learning. However, when model misspecifications exist in the prior and/or data distributions, the direct application of Bayes' rule is questionable. Philosophically, the key is to balance the relative importance of the prior and data distributions when computing posterior distributions: if the prior distribution is overly conservative (i.e., exceedingly spread), we upweight the prior belief; if the prior distribution is overly opportunistic (i.e., exceedingly concentrated), we downweight the prior belief. The same operation applies to the data distribution. This paper studies a generalized Bayes' rule, called the uncertainty-aware Bayes' rule, that technically realizes the above philosophy and thereby combats model uncertainties in the prior and/or data distributions. Applications of the uncertainty-aware Bayes' rule to classification and estimation are discussed: in particular, the uncertainty-aware Bayes classifier, the uncertainty-aware Kalman filter, the uncertainty-aware particle filter, and the uncertainty-aware interacting multiple model filter are suggested and experimentally validated.
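To make the weighting philosophy concrete, the sketch below works out a power-tempered posterior p(θ | y) ∝ p(θ)^α · p(y | θ)^β in the conjugate Gaussian case, where α > 1 upweights (sharpens) an overly spread prior and α < 1 downweights (flattens) an overly concentrated one, and β plays the analogous role for the data distribution. This is only a minimal illustration of the idea under assumed names and a Gaussian setting; the paper's exact uncertainty-aware rule and its weight-selection mechanism may differ.

```python
import numpy as np

def uncertainty_aware_gaussian_update(mu0, var0, y, noise_var, alpha=1.0, beta=1.0):
    """Power-tempered conjugate-Gaussian update (illustrative sketch, not the paper's exact rule).

    Prior: N(mu0, var0). Observations y are i.i.d. N(theta, noise_var).
    The prior density is raised to the power alpha and the likelihood to the power beta;
    alpha = beta = 1 recovers the ordinary Bayes posterior.
    """
    y = np.atleast_1d(np.asarray(y, dtype=float))
    n = y.size
    # Tempering a Gaussian by a power rescales its precision, so the posterior of
    # prior^alpha * likelihood^beta is again Gaussian with the precisions weighted.
    post_precision = alpha / var0 + beta * n / noise_var
    post_mean = (alpha * mu0 / var0 + beta * y.sum() / noise_var) / post_precision
    return post_mean, 1.0 / post_precision

# Overly concentrated (opportunistic) prior: downweight it with alpha < 1.
print(uncertainty_aware_gaussian_update(mu0=0.0, var0=0.01, y=[2.1, 1.9, 2.0],
                                        noise_var=1.0, alpha=0.2))
# Overly spread (conservative) prior: upweight it with alpha > 1.
print(uncertainty_aware_gaussian_update(mu0=0.0, var0=100.0, y=[2.1, 1.9, 2.0],
                                        noise_var=1.0, alpha=3.0))
```

In the first call the tempered posterior leans much closer to the data than the standard posterior would, because the overconfident prior is flattened; in the second call the weak prior is sharpened, pulling the estimate slightly toward it.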