Variational Bayes Made Easy (2304.14251v2)

Published 27 Apr 2023 in cs.LG, cs.AI, and stat.ML

Abstract: Variational Bayes is a popular method for approximate inference but its derivation can be cumbersome. To simplify the process, we give a 3-step recipe to identify the posterior form by explicitly looking for linearity with respect to expectations of well-known distributions. We can then directly write the update by simply "reading off" the terms in front of those expectations. The recipe makes the derivation easier, faster, shorter, and more general.
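
As a rough illustration of the kind of "read-off" the abstract describes, consider a standard mean-field example; the model, priors, and notation below are assumptions chosen for illustration and are not taken from the paper itself. Let $x_1, \dots, x_n \sim \mathcal{N}(\mu, \tau^{-1})$ with priors $\mu \sim \mathcal{N}(0, 1)$ and $\tau \sim \mathrm{Gamma}(a_0, b_0)$, and a factorized posterior $q(\mu)\,q(\tau)$. The usual coordinate update $\log q^*(\mu) = \mathbb{E}_{q(\tau)}[\log p(x, \mu, \tau)] + \text{const}$ gives

\[ \log q^*(\mu) = -\tfrac{1}{2}\bigl(n\,\mathbb{E}[\tau] + 1\bigr)\,\mu^2 + \mathbb{E}[\tau]\Bigl(\textstyle\sum_{i} x_i\Bigr)\mu + \text{const}, \]

which is linear in the Gaussian sufficient statistics $(\mu, \mu^2)$, so $q^*(\mu)$ is Gaussian and its natural parameters are read off directly from the coefficients: precision $n\,\mathbb{E}[\tau] + 1$ and precision-times-mean $\mathbb{E}[\tau]\sum_i x_i$. Similarly,

\[ \log q^*(\tau) = \bigl(a_0 + \tfrac{n}{2} - 1\bigr)\log\tau - \Bigl(b_0 + \tfrac{1}{2}\,\mathbb{E}_{q(\mu)}\bigl[\textstyle\sum_i (x_i - \mu)^2\bigr]\Bigr)\tau + \text{const}, \]

so $q^*(\tau)$ is a Gamma distribution whose shape and rate are read off from the terms in front of $\log\tau$ and $\tau$. This is the pattern the abstract's recipe is meant to streamline: take expectations over the other factors, check linearity in expectations of well-known distributions, then read off the terms in front of those expectations.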


Authors (1)
Mohammad Emtiyaz Khan
