On the Approximation Accuracy of Gaussian Variational Inference
Abstract: The main computational challenge in Bayesian inference is to compute integrals against a high-dimensional posterior distribution. In recent decades, variational inference (VI) has emerged as a tractable approximation to these integrals and a viable alternative to the more established paradigm of Markov chain Monte Carlo. However, little is known about the approximation accuracy of VI. In this work, we bound the total variation (TV) error and the mean and covariance approximation errors of Gaussian VI in terms of dimension and sample size. Our error analysis relies on a Hermite series expansion of the log posterior whose leading terms are precisely cancelled by the first-order optimality conditions associated with the Gaussian VI optimization problem.
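To make the optimization problem in question concrete, the display below is a minimal sketch of Gaussian VI and of the standard first-order (stationarity) conditions alluded to in the abstract. The notation is an assumption for illustration only: the posterior is written as $\pi \propto e^{-V}$ and the Gaussian candidate as $\mathcal{N}(m, S)$; the paper's own scaling in the sample size may differ.

```latex
% Minimal sketch (illustrative notation, not the paper's): Gaussian VI selects
% the Gaussian closest to the posterior \pi in KL divergence.
\[
  \hat q \;=\; \operatorname*{arg\,min}_{q = \mathcal{N}(m,\,S)} \,
  \mathrm{KL}\!\left(q \,\Vert\, \pi\right),
  \qquad \pi(x) \propto e^{-V(x)}.
\]
% Stationarity of the KL objective in the mean m and covariance S gives the
% standard first-order optimality conditions over the Gaussian family:
\[
  \mathbb{E}_{\hat q}\!\left[\nabla V(X)\right] \;=\; 0,
  \qquad
  \hat S^{-1} \;=\; \mathbb{E}_{\hat q}\!\left[\nabla^{2} V(X)\right].
\]
% It is identities of this form that cancel the leading terms of the Hermite
% expansion of the log posterior in the error analysis described above.
```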