Variational Bayes Made Easy (2304.14251v2)
Published 27 Apr 2023 in cs.LG, cs.AI, and stat.ML
Abstract: Variational Bayes is a popular method for approximate inference, but its derivation can be cumbersome. To simplify the process, we give a 3-step recipe to identify the posterior form by explicitly looking for linearity with respect to expectations of well-known distributions. We can then directly write the update by simply "reading off" the terms in front of those expectations. The recipe makes the derivation easier, faster, shorter, and more general.
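To see what "reading off" means in practice, here is a minimal worked sketch on a toy conjugate model: i.i.d. Gaussian observations with a Gaussian prior on the mean and a Gamma prior on the precision, under a mean-field factorization q(mu, tau) = q(mu) q(tau). This is an illustration of the generic idea, not the paper's own derivation, and the hyperparameters m_0, s_0, a_0, b_0 are hypothetical choices made for the example.

```latex
% Illustrative mean-field VB sketch (assumed toy model, not taken from the paper).
% Model: x_i ~ N(mu, tau^{-1}) for i = 1..n,
%        mu  ~ N(m_0, s_0^{-1}),   tau ~ Gamma(a_0, b_0).
% Mean-field family: q(mu, tau) = q(mu) q(tau).

% Standard coordinate-ascent identity for the mu factor:
\log q^*(\mu)
  = \mathbb{E}_{q(\tau)}\!\left[\log p(x,\mu,\tau)\right] + \text{const}
  = -\frac{\mathbb{E}[\tau]}{2}\sum_{i=1}^{n}(x_i-\mu)^2
    -\frac{s_0}{2}(\mu-m_0)^2 + \text{const}.

% The right-hand side is linear in the single expectation E[tau] and quadratic
% in mu, so q*(mu) must be Gaussian; reading off the coefficients gives
q^*(\mu) = \mathcal{N}\!\left(\mu \,\middle|\, m,\, s^{-1}\right),\qquad
s = s_0 + n\,\mathbb{E}[\tau],\qquad
m = \frac{s_0 m_0 + \mathbb{E}[\tau]\sum_{i} x_i}{s}.

% Symmetrically, log q*(tau) is linear in tau and log(tau), so q*(tau) is Gamma:
q^*(\tau) = \mathrm{Gamma}\!\left(\tau \,\middle|\, a,\, b\right),\qquad
a = a_0 + \tfrac{n}{2},\qquad
b = b_0 + \tfrac{1}{2}\sum_{i=1}^{n}\left[(x_i-m)^2 + s^{-1}\right].
```

Each update only needs simple moments of the other factor (E[tau] = a/b for the mu step; m and s for the tau step), so iterating to convergence amounts to reading coefficients off the log joint rather than differentiating a KL objective by hand.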