On the Convergence of Black-Box Variational Inference (2305.15349v4)

Published 24 May 2023 in cs.LG, eess.SP, math.OC, stat.CO, and stat.ML

Abstract: We provide the first convergence guarantee for full black-box variational inference (BBVI), also known as Monte Carlo variational inference. While preliminary investigations worked on simplified versions of BBVI (e.g., bounded domain, bounded support, only optimizing for the scale, and such), our setup does not need any such algorithmic modifications. Our results hold for log-smooth posterior densities with and without strong log-concavity and the location-scale variational family. Also, our analysis reveals that certain algorithm design choices commonly employed in practice, particularly, nonlinear parameterizations of the scale of the variational approximation, can result in suboptimal convergence rates. Fortunately, running BBVI with proximal stochastic gradient descent fixes these limitations, and thus achieves the strongest known convergence rate guarantees. We evaluate this theoretical insight by comparing proximal SGD against other standard implementations of BBVI on large-scale Bayesian inference problems.
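To make the abstract's main algorithmic point concrete, below is a minimal sketch of BBVI with proximal SGD for a mean-field Gaussian (location-scale) family. It is an illustration under stated assumptions, not the authors' code: the toy log-smooth target, step-size schedule, and dimensions are all made up. The key idea it demonstrates is the one the abstract highlights: take a reparameterized stochastic gradient step on the energy term of the ELBO only, then absorb the entropy term exactly through its closed-form proximal operator, which keeps the scale parameters positive without any nonlinear reparameterization.

```python
# Hypothetical sketch of BBVI with proximal SGD (not the paper's implementation).
# Variational family: mean-field Gaussian q(z) = N(m, diag(s^2)), so the
# ELBO(m, s) = E_q[log p(z)] + sum(log s) + const.
# Proximal SGD: SGD ascent on the energy term E_q[log p(z)] via the
# reparameterization trick, then an exact proximal step for -sum(log s).
import numpy as np

rng = np.random.default_rng(0)
d = 5

# Toy log-smooth, strongly log-concave target: log p(z) = -0.5 * z' diag(A) z,
# a stand-in for a real posterior; the optimal scale is A**(-1/2).
A = np.linspace(0.5, 2.0, d)

def grad_log_p(z):
    return -A * z

def prox_neg_log(s_tilde, gamma):
    # Proximal operator of f(s) = -gamma * sum(log s): the positive root of
    # s^2 - s_tilde * s - gamma = 0, applied elementwise.
    return 0.5 * (s_tilde + np.sqrt(s_tilde**2 + 4.0 * gamma))

m = np.zeros(d)   # variational location
s = np.ones(d)    # variational scale (stays > 0 thanks to the prox step)

for t in range(5000):
    step = 2.0 / (t + 100.0)          # Robbins-Monro style decaying step size
    eps = rng.standard_normal(d)
    z = m + s * eps                   # reparameterization trick: z ~ q
    g = grad_log_p(z)                 # score of the target at the sample
    # Stochastic gradient ascent on the energy term only
    m = m + step * g
    s_tilde = s + step * g * eps
    # Proximal step handling the -sum(log s) entropy term exactly
    s = prox_neg_log(s_tilde, step)

print("location:", np.round(m, 3))    # should approach 0 (up to SGD noise)
print("scale   :", np.round(s, 3))    # should approach A**(-0.5)
```

The contrast with common practice that the abstract draws is that instead of enforcing positivity of the scale through a nonlinear map such as softplus or exp (which the analysis suggests can degrade the convergence rate), the proximal step keeps the update in the original parameterization while guaranteeing s > 0.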

Authors (5)
  1. Kyurae Kim (16 papers)
  2. Jisu Oh (5 papers)
  3. Kaiwen Wu (14 papers)
  4. Yi-An Ma (49 papers)
  5. Jacob R. Gardner (39 papers)
Citations (14)
