
Second order stochastic gradient update for Cholesky factor in Gaussian variational approximation from Stein's Lemma

Published 19 Oct 2022 in stat.ME (arXiv:2210.10566v1)

Abstract: In stochastic variational inference, use of the reparametrization trick for the multivariate Gaussian gives rise to efficient updates for the mean and Cholesky factor of the covariance matrix, which depend on the first order derivative of the log joint model density. In this article, we show that an alternative unbiased gradient estimate for the Cholesky factor, which depends on the second order derivative of the log joint model density, can be derived using Stein's Lemma. This leads to a second order stochastic gradient update for the Cholesky factor that can improve convergence, since its variance is lower than that of the first order update and becomes almost negligible close to the mode. We also derive a second order update for the Cholesky factor of the precision matrix, which is useful when the precision matrix has a sparse structure reflecting conditional independence in the true posterior distribution. Our results can also be used to obtain second order natural gradient updates for the Cholesky factor, which are more robust than updates based on Euclidean gradients.
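The variance contrast described in the abstract can be illustrated with a minimal sketch (not the paper's implementation). Under the reparametrization θ = μ + Cε with ε ~ N(0, I), the first order estimate of the gradient of E[log p(θ)] with respect to the lower-triangular C is tril(∇log p(θ) εᵀ), while Stein's Lemma gives the alternative unbiased estimate tril(∇²log p(θ) C). The quadratic log density, the matrix `A`, and all variable names below are illustrative assumptions; for a quadratic target the Hessian is constant, so the second order estimate has zero variance, mimicking the behaviour near the mode:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3

# Illustrative quadratic log joint density: log p(theta) = -0.5 theta^T A theta
A = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.5, 0.3],
              [0.0, 0.3, 1.0]])

def grad_logp(theta):   # first order derivative of log p
    return -A @ theta

def hess_logp(theta):   # second order derivative of log p (constant here)
    return -A

mu = np.array([0.1, -0.2, 0.05])           # variational mean, near the mode at 0
C = np.linalg.cholesky(np.linalg.inv(A))   # variational Cholesky factor

def grad_C_first(eps):
    # First order (reparametrization trick) estimate: tril(grad log p(theta) eps^T)
    theta = mu + C @ eps
    return np.tril(np.outer(grad_logp(theta), eps))

def grad_C_second(eps):
    # Second order (Stein's Lemma) estimate: tril(hess log p(theta) C)
    theta = mu + C @ eps
    return np.tril(hess_logp(theta) @ C)

g1 = np.array([grad_C_first(rng.standard_normal(d)) for _ in range(2000)])
g2 = np.array([grad_C_second(rng.standard_normal(d)) for _ in range(2000)])

# Both estimators are unbiased for tril(-A C); only their variances differ.
print("first-order estimate variance :", g1.var(axis=0).sum())
print("second-order estimate variance:", g2.var(axis=0).sum())
```

Running this, the first order estimator shows substantial Monte Carlo variance while the second order estimator's variance collapses, because the Hessian of a quadratic density does not depend on the sample. For non-quadratic targets the second order variance is small, rather than exactly zero, once the iterates are close to the mode.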
