Implicit copula variational inference (2111.09511v3)

Published 18 Nov 2021 in stat.ME and stat.CO

Abstract: Key to effective generic, or "black-box", variational inference is the selection of an approximation to the target density that balances accuracy and speed. Copula models are promising options, but calibration of the approximation can be slow for some choices. Smith et al. (2020) suggest using tractable and scalable "implicit copula" models that are formed by element-wise transformation of the target parameters. We propose an adjustment to these transformations that makes the approximation invariant to the scale and location of the target density. We also show how a sub-class of elliptical copulas has a generative representation that allows easy application of the re-parameterization trick and efficient first-order optimization. We demonstrate the estimation methodology using two statistical models as examples. The first is a mixed effects logistic regression, and the second is a regularized correlation matrix. For the latter, standard Markov chain Monte Carlo estimation methods can be slow or difficult to implement, yet our proposed variational approach provides an effective and scalable estimator. We illustrate by estimating a regularized Gaussian copula model for income inequality in U.S. states between 1917 and 2018. An Online Appendix and MATLAB code to implement the method are available as Supplementary Materials.
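
The generative representation and re-parameterization trick mentioned in the abstract can be illustrated with a minimal sketch. The code below is not the authors' MATLAB implementation; it is a hypothetical NumPy illustration of sampling from a Gaussian (elliptical) implicit-copula approximation built by element-wise transformation. The transformation t_inv, the correlation matrix, and the location/scale terms mu and sigma are all assumptions for illustration.

```python
# Hypothetical sketch (not the paper's MATLAB code): draw a re-parameterized
# sample from an implicit Gaussian-copula variational approximation.
import numpy as np

rng = np.random.default_rng(0)

d = 3                                   # dimension of the target parameter
L = np.linalg.cholesky(                 # Cholesky factor of the copula correlation
    np.array([[1.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.0]]))
mu, sigma = np.zeros(d), np.ones(d)     # illustrative location/scale terms in the
                                        # element-wise map (loosely in the spirit of
                                        # the adjustment described in the abstract)

def t_inv(z, mu, sigma):
    """Element-wise monotone map from Gaussian margins to the parameter space.
    An (assumed) exponential link giving positive parameters; the implicit-copula
    construction allows general monotone transformations."""
    return np.exp(mu + sigma * z)

# Re-parameterization trick: theta is a deterministic, differentiable function of
# the variational parameters (L, mu, sigma) and parameter-free noise eps, so
# gradients of an expected objective can be pushed through the sample.
eps = rng.standard_normal(d)            # eps ~ N(0, I)
z = L @ eps                             # correlated Gaussian with unit variances
theta = t_inv(z, mu, sigma)             # draw from the implicit-copula approximation
print(theta)
```

In a full variational scheme, samples like this would feed a Monte Carlo estimate of the objective, and automatic differentiation through mu, sigma, and an unconstrained parameterization of L would supply the gradients for first-order optimization; the sketch shows only the sampling step.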
