Practical and Matching Gradient Variance Bounds for Black-Box Variational Bayesian Inference (2303.10472v4)

Published 18 Mar 2023 in cs.LG, math.OC, stat.CO, and stat.ML

Abstract: Understanding the gradient variance of black-box variational inference (BBVI) is a crucial step for establishing its convergence and developing algorithmic improvements. However, existing studies have yet to show that the gradient variance of BBVI satisfies the conditions used to study the convergence of stochastic gradient descent (SGD), the workhorse of BBVI. In this work, we show that BBVI satisfies a matching bound corresponding to the $ABC$ condition used in the SGD literature when applied to smooth and quadratically-growing log-likelihoods. Our results generalize to nonlinear covariance parameterizations widely used in the practice of BBVI. Furthermore, we show that the variance of the mean-field parameterization has provably superior dimensional dependence.
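For context, the $ABC$ condition named in the abstract is, in the form standard in the SGD literature (constants may differ by a factor across papers; this gloss is not taken from the page itself), a bound on the second moment of the stochastic gradient estimator $\widehat{\nabla} f(x)$: $\mathbb{E}\|\widehat{\nabla} f(x)\|^2 \le 2A\,(f(x) - f^{\inf}) + B\,\|\nabla f(x)\|^2 + C$ for constants $A, B, C \ge 0$, where $f^{\inf}$ is a lower bound on $f$.

The estimator in question is the Monte Carlo ELBO gradient that BBVI feeds to SGD. As a minimal illustrative sketch only (not the paper's code; `grad_log_joint` and the $\exp$ covariance parameterization are assumptions standing in for one of the nonlinear parameterizations the paper covers), a reparameterized gradient for a mean-field Gaussian looks like:

```python
import numpy as np

def bbvi_gradient(mu, rho, grad_log_joint, n_samples=16, rng=None):
    """One reparameterized Monte Carlo estimate of the ELBO gradient
    for a mean-field Gaussian q(z) = N(mu, diag(exp(rho))^2)."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = np.exp(rho)                      # nonlinear covariance parameterization
    eps = rng.standard_normal((n_samples, mu.size))
    z = mu + sigma * eps                     # reparameterization trick
    g = np.apply_along_axis(grad_log_joint, 1, z)  # grad log p(x, z) per sample
    grad_mu = g.mean(axis=0)                 # d ELBO / d mu
    # Chain rule through z = mu + exp(rho) * eps; the Gaussian entropy
    # term sum(rho) contributes +1 to each coordinate of d ELBO / d rho.
    grad_rho = (g * eps).mean(axis=0) * sigma + 1.0
    return grad_mu, grad_rho

# Toy usage: standard-normal target, so grad log p(z) = -z.
if __name__ == "__main__":
    mu, rho = np.zeros(3), np.zeros(3)
    print(bbvi_gradient(mu, rho, lambda z: -z))
```

The paper's contribution, per the abstract, is showing that the variance of estimators of this kind satisfies a matching $ABC$-type bound under smooth, quadratically growing log-likelihoods, with the mean-field factorization yielding provably better dimension dependence.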

Authors (4)
  1. Kyurae Kim (16 papers)
  2. Kaiwen Wu (14 papers)
  3. Jisu Oh (5 papers)
  4. Jacob R. Gardner (39 papers)
Citations (6)
