Differentiable Antithetic Sampling for Variance Reduction in Stochastic Variational Inference (1810.02555v2)
Published 5 Oct 2018 in cs.LG and stat.ML
Abstract: Stochastic optimization techniques are standard in variational inference algorithms. These methods estimate gradients by approximating expectations with independent Monte Carlo samples. In this paper, we explore a technique that uses correlated, but more representative, samples to reduce estimator variance. Specifically, we show how to generate antithetic samples whose sample moments match the true moments of an underlying importance distribution. Combining a differentiable antithetic sampler with modern stochastic variational inference, we showcase the effectiveness of this approach for learning a deep generative model.
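To illustrate the core idea, here is a minimal sketch of classical antithetic sampling for a reparameterized Gaussian: drawing noise in mirrored pairs (eps, -eps) forces the sample mean of the noise to be exactly zero, so the first sample moment matches the distribution's true mean. This is a simplified textbook example, not the paper's full differentiable moment-matching sampler; the function name and setup are illustrative assumptions.

```python
import numpy as np

def antithetic_gaussian_samples(mu, sigma, n, rng):
    """Draw n Gaussian samples in antithetic pairs (eps, -eps).

    Mirroring the noise around zero makes the empirical mean of the
    noise exactly zero, so the samples' first moment matches mu.
    Illustrative sketch only, not the paper's moment-matching sampler.
    """
    half = n // 2
    eps = rng.standard_normal(half)
    eps = np.concatenate([eps, -eps])  # antithetic pairs sum to zero
    return mu + sigma * eps

rng = np.random.default_rng(0)
samples = antithetic_gaussian_samples(2.0, 1.5, 1000, rng)
print(samples.mean())  # equals mu (up to floating-point error)
```

For a Monte Carlo estimator of E[f(x)] with monotone f, averaging over such negatively correlated pairs typically yields lower variance than using the same number of independent samples, which is the effect the paper exploits inside stochastic variational inference.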
- Mike Wu (30 papers)
- Noah Goodman (57 papers)
- Stefano Ermon (279 papers)