
Exact ABC using Importance Sampling (1509.08076v1)

Published 27 Sep 2015 in stat.ME

Abstract: Approximate Bayesian Computation (ABC) is a powerful method for carrying out Bayesian inference when the likelihood is computationally intractable. However, a drawback of ABC is that it is an approximate method that induces a systematic error, because it is necessary to set a tolerance level to make the computation tractable. The issue of how to optimally set this tolerance level has been the subject of extensive research. This paper proposes an ABC algorithm based on importance sampling that estimates expectations with respect to the "exact" posterior distribution given the observed summary statistics. This overcomes the need to select the tolerance level. By "exact" we mean that there is no systematic error and the Monte Carlo error can be made arbitrarily small by increasing the number of importance samples. We provide a formal justification for the method and study its convergence properties. The method is illustrated in two applications, and the empirical results suggest that the proposed ABC-based estimators consistently converge to the true values as the number of importance samples increases. Our proposed approach can be applied more generally to any importance sampling problem where an unbiased estimate of the likelihood is required.
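The abstract describes the general mechanism: replace the intractable likelihood of the observed summary statistics with a nonnegative simulation-based estimate, and plug it into a self-normalised importance sampler so that posterior expectations can be estimated without choosing an ABC tolerance. The sketch below illustrates only that generic mechanism under assumptions of our own. The toy model, the proposal distribution, and the helper `estimate_summary_likelihood` are all illustrative placeholders, not the paper's construction; in particular, the plain Gaussian-kernel estimate used here is unbiased only for a kernel-smoothed likelihood, whereas the paper's estimator is built so that the resulting estimates target the exact summary-statistic posterior.

```
import numpy as np

rng = np.random.default_rng(0)

# ---- Toy model (illustrative assumption, not from the paper) ---------------
# Data: y_1..y_n ~ N(theta, 1); summary statistic s = sample mean.
n_obs = 100
theta_true = 1.5
y_obs = rng.normal(theta_true, 1.0, size=n_obs)
s_obs = y_obs.mean()

def prior_logpdf(theta):
    # N(0, 5^2) prior on theta.
    return -0.5 * (theta / 5.0) ** 2 - np.log(5.0 * np.sqrt(2 * np.pi))

def simulate_summary(theta, m):
    # Draw m pseudo-datasets under theta and return their summary statistics.
    pseudo = rng.normal(theta, 1.0, size=(m, n_obs))
    return pseudo.mean(axis=1)

def estimate_summary_likelihood(theta, s_obs, m=50, h=0.05):
    # Placeholder estimator: Gaussian-kernel estimate of p(s_obs | theta)
    # built from m simulated summaries. It is unbiased only for the
    # kernel-smoothed likelihood; the paper's unbiased construction for the
    # exact summary likelihood would replace this function.
    s_sim = simulate_summary(theta, m)
    kernel = np.exp(-0.5 * ((s_obs - s_sim) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return kernel.mean()

# ---- Self-normalised importance sampling over theta ------------------------
# Proposal q(theta): a normal roughly centred on the observed summary.
q_mean, q_sd = s_obs, 0.5
N = 5000
theta_draws = rng.normal(q_mean, q_sd, size=N)
q_logpdf = (-0.5 * ((theta_draws - q_mean) / q_sd) ** 2
            - np.log(q_sd * np.sqrt(2 * np.pi)))

like_hat = np.array([estimate_summary_likelihood(t, s_obs) for t in theta_draws])
log_w = prior_logpdf(theta_draws) + np.log(like_hat + 1e-300) - q_logpdf
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Self-normalised IS estimate of E[theta | s_obs].
print("Estimated E[theta | s_obs]:", np.sum(w * theta_draws))
print("Observed summary (sample mean):", s_obs)
```

The design point the abstract emphasises is that, when the likelihood estimate plugged into the weights is unbiased and nonnegative, the importance-sampling estimator has no systematic error from a tolerance level, and its Monte Carlo error shrinks as the number of importance samples N grows.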
