Scalable Bernoulli factories for Bayesian inference with intractable likelihoods (2505.05438v1)

Published 8 May 2025 in stat.CO

Abstract: Bernoulli factory MCMC algorithms implement accept-reject Markov chains without explicit computation of acceptance probabilities, and are used to target posterior distributions associated with intractable likelihood models. These algorithms often mix better than alternatives based on data augmentation or acceptance probability estimation. However, we show that their computational performance typically deteriorates exponentially with data size. To address this, we propose a simple divide-and-conquer Bernoulli factory MCMC algorithm and prove that it has polynomial complexity of degree between 1 and 2, with the exact degree depending on the existence of efficient unbiased estimators of the intractable likelihood ratio. We demonstrate the effectiveness of our approach with applications to Bayesian inference in two intractable likelihood models, and observe polynomial costs of degree 1.2 and 1, respectively, in the data size.
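
The divide-and-conquer algorithm itself is not reproduced on this page. As a rough illustration of the general mechanism behind Bernoulli factory MCMC, the sketch below implements the standard two-coin Bernoulli factory for a Barker-type acceptance event: it returns an accept/reject decision distributed as Bernoulli(c_y*p_y / (c_x*p_x + c_y*p_y)) using only coin flips with the intractable heads probabilities p_x and p_y, never evaluating the acceptance probability itself. All names here (two_coin_barker_accept, coin_x, coin_y, c_x, c_y) are illustrative assumptions, not code or notation from the paper.

import random

def two_coin_barker_accept(c_x, c_y, coin_x, coin_y, rng=random):
    # Simulate an event of probability c_y*p_y / (c_x*p_x + c_y*p_y)
    # (a Barker-type acceptance) without ever computing p_x or p_y.
    # coin_x() / coin_y() return True with the intractable probabilities
    # p_x, p_y; c_x, c_y are known positive bounding constants.
    while True:
        # Enter the "accept" branch with probability c_y / (c_x + c_y).
        if rng.random() < c_y / (c_x + c_y):
            if coin_y():   # heads on the y-coin: accept the proposal
                return True
        else:
            if coin_x():   # heads on the x-coin: reject the proposal
                return False
        # Tails on either coin: discard the attempt and flip again.

In an MCMC step one would pass coins whose heads probabilities are proportional to the intractable likelihood terms at the current and proposed states. The expected number of coin flips per decision is (c_x + c_y) / (c_x*p_x + c_y*p_y), so loose bounding constants make the loop long; this kind of per-iteration cost is what motivates the paper's divide-and-conquer construction, though the specific scheme and its complexity analysis are given in the paper itself.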
