Variational Boosting: Iteratively Refining Posterior Approximations (1611.06585v2)

Published 20 Nov 2016 in stat.ML, cs.LG, and stat.ME

Abstract: We propose a black-box variational inference method to approximate intractable distributions with an increasingly rich approximating class. Our method, termed variational boosting, iteratively refines an existing variational approximation by solving a sequence of optimization problems, allowing the practitioner to trade computation time for accuracy. We show how to expand the variational approximating class by incorporating additional covariance structure and by introducing new components to form a mixture. We apply variational boosting to synthetic and real statistical models, and show that the resulting posterior inferences compare favorably to existing posterior approximation algorithms in both accuracy and efficiency.
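The abstract describes the core mechanism: freeze the current mixture approximation, append one new component, and optimize only the new component's parameters and its mixing weight against a Monte Carlo ELBO. Below is a minimal sketch of a single boosting step, not the authors' implementation: it assumes diagonal-Gaussian components, a toy correlated-Gaussian target, and a JAX/optax setup, and every name in it (`log_p`, `neg_elbo`, etc.) is hypothetical.

```python
# Sketch of one variational-boosting step (assumptions noted above).
import jax
import jax.numpy as jnp
import optax

def log_p(x):
    # Hypothetical target: zero-mean 2-D Gaussian with correlation 0.9 (unnormalized).
    prec = jnp.linalg.inv(jnp.array([[1.0, 0.9], [0.9, 1.0]]))
    return -0.5 * x @ prec @ x

def log_normal(x, mu, log_sig):
    # Log-density of a diagonal Gaussian.
    z = (x - mu) / jnp.exp(log_sig)
    return jnp.sum(-0.5 * z**2 - log_sig - 0.5 * jnp.log(2 * jnp.pi))

def log_q(x, mus, log_sigs, log_w):
    # Log-density of the mixture: logsumexp over weighted components.
    comps = jax.vmap(lambda m, s: log_normal(x, m, s))(mus, log_sigs)
    return jax.scipy.special.logsumexp(log_w + comps)

def neg_elbo(params, fixed, key, n_samples=32):
    # Boosting step: existing components in `fixed` stay frozen; we learn
    # the new component (mu, log_sig) and a logit for its mixing weight.
    mu, log_sig, logit = params
    mus_f, log_sigs_f, log_w_f = fixed
    rho = jax.nn.sigmoid(logit)
    mus = jnp.vstack([mus_f, mu])
    log_sigs = jnp.vstack([log_sigs_f, log_sig])
    log_w = jnp.concatenate([log_w_f + jnp.log1p(-rho), jnp.log(rho)[None]])
    # Reparameterized Monte Carlo ELBO, using E_q[f] = sum_k w_k E_{q_k}[f].
    eps = jax.random.normal(key, (n_samples,) + mus.shape)
    xs = mus + jnp.exp(log_sigs) * eps  # samples drawn per component
    f = jax.vmap(jax.vmap(
        lambda x: log_p(x) - log_q(x, mus, log_sigs, log_w)))(xs)
    return -jnp.mean(f @ jnp.exp(log_w))

# One iteration, starting from a single fixed component at the origin.
fixed = (jnp.zeros((1, 2)), jnp.zeros((1, 2)), jnp.zeros(1))
params = (jnp.ones(2), jnp.zeros(2) - 1.0, jnp.array(0.0))
opt = optax.adam(1e-2)
state = opt.init(params)
loss_grad = jax.jit(jax.grad(neg_elbo))
key = jax.random.PRNGKey(0)
for step in range(500):
    key, sub = jax.random.split(key)
    grads = loss_grad(params, fixed, sub)
    updates, state = opt.update(grads, state)
    params = optax.apply_updates(params, updates)
```

Subsequent boosting iterations would append the optimized component to `fixed` and repeat. The paper also expands components with richer covariance structure (e.g., low-rank plus diagonal), which this diagonal sketch omits.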

Authors (3)
  1. Andrew C. Miller (17 papers)
  2. Nicholas Foti (5 papers)
  3. Ryan P. Adams (74 papers)
Citations (118)
