EigenVI: score-based variational inference with orthogonal function expansions (2410.24054v1)
Abstract: We develop EigenVI, an eigenvalue-based approach for black-box variational inference (BBVI). EigenVI constructs its variational approximations from orthogonal function expansions. For distributions over $\mathbb{R}^D$, the lowest order term in these expansions provides a Gaussian variational approximation, while higher-order terms provide a systematic way to model non-Gaussianity. These approximations are flexible enough to model complex distributions (multimodal, asymmetric), but they are simple enough that one can calculate their low-order moments and draw samples from them. EigenVI can also model other types of random variables (e.g., nonnegative, bounded) by constructing variational approximations from different families of orthogonal functions. Within these families, EigenVI computes the variational approximation that best matches the score function of the target distribution by minimizing a stochastic estimate of the Fisher divergence. Notably, this optimization reduces to solving a minimum eigenvalue problem, so that EigenVI effectively sidesteps the iterative gradient-based optimizations that are required for many other BBVI algorithms. (Gradient-based methods can be sensitive to learning rates, termination criteria, and other tunable hyperparameters.) We use EigenVI to approximate a variety of target distributions, including a benchmark suite of Bayesian models from posteriordb. On these distributions, we find that EigenVI is more accurate than existing methods for Gaussian BBVI.
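The abstract's central claim is that score matching over an orthogonal function expansion reduces to a minimum eigenvalue problem. The following is a minimal one-dimensional sketch of that idea, not the authors' implementation: it fits $q(x) = \big(\sum_k \alpha_k \phi_k(x)\big)^2$ in an orthonormal Hermite-function basis by minimizing an importance-sampled Fisher-divergence estimate, which is a quadratic form $\alpha^\top M \alpha$ minimized (under $\|\alpha\| = 1$) by the eigenvector of $M$ with smallest eigenvalue. The function names, the standard-normal proposal distribution, and the toy bimodal target are all illustrative assumptions.

```python
# Minimal 1D sketch of the EigenVI idea (illustrative, not the authors' code).
import numpy as np


def hermite_fns(K, x):
    """Orthonormal Hermite functions phi_0..phi_K at points x.

    phi_k(x) = H_k(x) exp(-x^2/2) / sqrt(2^k k! sqrt(pi)), orthonormal on R,
    built with the stable three-term recurrence.
    """
    phi = np.empty((K + 1, x.size))
    phi[0] = np.pi ** (-0.25) * np.exp(-x**2 / 2.0)
    if K >= 1:
        phi[1] = np.sqrt(2.0) * x * phi[0]
    for k in range(2, K + 1):
        phi[k] = np.sqrt(2.0 / k) * x * phi[k - 1] - np.sqrt((k - 1) / k) * phi[k - 2]
    return phi


def fit_eigenvi_1d(score_p, K=8, n_samples=4000, seed=0):
    """Fit q(x) = (sum_k alpha_k phi_k(x))^2 to a target score function.

    Minimizes an importance-sampled Fisher-divergence estimate; the optimum
    is the minimum eigenvector of a (K+1) x (K+1) matrix M.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_samples)              # proposals from pi = N(0, 1)
    w = np.sqrt(2.0 * np.pi) * np.exp(x**2 / 2.0)   # importance weights 1 / pi(x)
    s = score_p(x)                                  # target score at the samples

    phi = hermite_fns(K + 1, x)                     # one extra order for the ladder
    # Derivative ladder: phi_k'(x) = sqrt(k/2) phi_{k-1} - sqrt((k+1)/2) phi_{k+1}
    dphi = np.empty((K + 1, x.size))
    dphi[0] = -np.sqrt(0.5) * phi[1]
    for k in range(1, K + 1):
        dphi[k] = np.sqrt(k / 2.0) * phi[k - 1] - np.sqrt((k + 1) / 2.0) * phi[k + 1]

    # With q = f^2 and f = sum_k alpha_k phi_k, the integrand of the Fisher
    # divergence satisfies q * (d/dx log q - s)^2 = (2 f' - s f)^2, which is
    # quadratic in alpha, so the Monte Carlo objective is alpha^T M alpha.
    U = 2.0 * dphi - s * phi[: K + 1]
    M = (U * w) @ U.T / n_samples
    eigvals, eigvecs = np.linalg.eigh(M)            # ascending eigenvalues
    return eigvecs[:, 0]                            # q(x) = (alpha @ phi(x))^2


# Example: a bimodal target p(x) ∝ exp(-(x^2 - 4)^2 / 4), score -x(x^2 - 4).
alpha = fit_eigenvi_1d(lambda x: -x * (x**2 - 4.0), K=10)
```

Because the basis functions are orthonormal and `eigh` returns a unit-norm eigenvector, the fitted `q` is automatically normalized; the entire fit is a single eigendecomposition, with no learning rates or termination criteria to tune, which is the trade-off the abstract emphasizes over gradient-based BBVI.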