
An adaptive mixture-population Monte Carlo method for likelihood-free inference (2112.00420v1)

Published 1 Dec 2021 in math.NA, cs.NA, math.ST, and stat.TH

Abstract: This paper focuses on variational inference with intractable likelihood functions that can be unbiasedly estimated. A flexible variational approximation based on Gaussian mixtures is developed by adopting the mixture population Monte Carlo (MPMC) algorithm of Cappé et al. (2008). MPMC iteratively updates the parameters of the mixture distribution using importance sampling computations, instead of the complicated gradient estimation of the optimization objective required in usual variational Bayes. Since MPMC uses a fixed number of mixture components, which is difficult to choose in advance for real applications, we further propose an automatic component-updating procedure to derive an appropriate number of components. The resulting adaptive MPMC algorithm is capable of finding good approximations of multi-modal posterior distributions even when started from a standard Gaussian as the initial distribution, as demonstrated in our numerical experiments.
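
To make the importance-sampling-based update concrete, the following is a minimal sketch (not the authors' code) of one Rao-Blackwellised M-PMC iteration in the style of Cappé et al. (2008), which the paper builds on. The function name, arguments, and the `log_target` callable (standing in for the possibly unbiasedly estimated log posterior) are illustrative assumptions, not part of the paper.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal

def mpmc_update(log_target, weights, means, covs, n_samples=2000, rng=None):
    """One illustrative importance-sampling update of a Gaussian-mixture proposal.

    log_target: callable returning the (possibly unbiasedly estimated)
                log of the unnormalised posterior density at a point.
    weights, means, covs: current parameters of a D-component mixture.
    """
    rng = np.random.default_rng() if rng is None else rng
    D = len(weights)

    # 1. Draw samples from the current mixture proposal.
    comp = rng.choice(D, size=n_samples, p=weights)
    xs = np.array([rng.multivariate_normal(means[d], covs[d]) for d in comp])

    # 2. Per-component log densities, mixture log density, and responsibilities.
    comp_logpdf = np.column_stack([
        multivariate_normal.logpdf(xs, means[d], covs[d]) for d in range(D)
    ])
    log_q = logsumexp(comp_logpdf + np.log(weights), axis=1)
    resp = np.exp(comp_logpdf + np.log(weights) - log_q[:, None])

    # 3. Self-normalised importance weights w_i proportional to pi(x_i) / q(x_i).
    log_w = np.array([log_target(x) for x in xs]) - log_q
    w = np.exp(log_w - logsumexp(log_w))  # sums to one

    # 4. Moment-matching (weighted EM-style) updates of the mixture parameters.
    wr = w[:, None] * resp
    new_weights = wr.sum(axis=0)
    new_means = (wr.T @ xs) / new_weights[:, None]
    new_covs = []
    for d in range(D):
        diff = xs - new_means[d]
        new_covs.append((wr[:, d, None] * diff).T @ diff / new_weights[d])
    return new_weights / new_weights.sum(), new_means, np.array(new_covs)
```

In the likelihood-free setting considered by the paper, `log_target` would return a log of an unbiased likelihood estimate plus the log prior; the adaptive variant additionally adds or removes mixture components between such updates rather than keeping D fixed.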

Citations (2)
