
Bayesian Mosaic: Parallelizable Composite Posterior (1804.00353v1)

Published 1 Apr 2018 in stat.ME

Abstract: This paper proposes Bayesian mosaic, a parallelizable composite posterior, for scalable Bayesian inference on a broad class of multivariate discrete data models. Sampling is embarrassingly parallel since Bayesian mosaic is a product of component posteriors that can be sampled from independently. Analogous to composite likelihood methods, these component posteriors are based on univariate or bivariate marginal densities. Using the fact that the score functions of these densities are unbiased, we show that Bayesian mosaic is consistent and asymptotically normal under mild conditions. Since the evaluation of univariate or bivariate marginal densities can rely on numerical integration, sampling from Bayesian mosaic bypasses the traditional data-augmented Markov chain Monte Carlo (DA-MCMC) method, which has a provably slow mixing rate when data are imbalanced. Moreover, we show that sampling from Bayesian mosaic scales better with sample size than DA-MCMC. The method is evaluated via simulation studies and an application to a citation count dataset.
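The embarrassingly parallel structure described above can be sketched in a few lines. The snippet below is a toy illustration, not the paper's actual model: it assumes independent Poisson univariate marginals with conjugate Gamma priors (so each component posterior has a closed form), whereas the paper targets general multivariate discrete models where component posteriors are sampled via numerical integration of marginal densities. The function names and prior hyperparameters (`a0`, `b0`) are hypothetical.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def sample_component_posterior(args):
    # One "tile" of the mosaic: a component posterior based only on a
    # single univariate marginal. Toy assumption: Poisson(lam_j)
    # likelihood with a Gamma(a0, b0) prior, giving a conjugate
    # Gamma(a0 + sum(y_j), b0 + n) posterior we can sample directly.
    y_j, a0, b0, n_draws, seed = args
    rng = np.random.default_rng(seed)
    n = len(y_j)
    return rng.gamma(a0 + y_j.sum(), 1.0 / (b0 + n), size=n_draws)

def bayesian_mosaic_sketch(Y, a0=1.0, b0=1.0, n_draws=1000):
    # Sampling is embarrassingly parallel: each component posterior
    # depends only on its own marginal data, so the components are
    # drawn concurrently and stacked into joint draws. A process pool
    # (or a cluster) would work the same way for heavier components.
    jobs = [(Y[:, j], a0, b0, n_draws, j) for j in range(Y.shape[1])]
    with ThreadPoolExecutor() as pool:
        draws = list(pool.map(sample_component_posterior, jobs))
    return np.column_stack(draws)  # shape (n_draws, d)
```

In this conjugate toy case the parallelism is trivial; the point of the sketch is the shape of the computation, in which no component ever needs to see another component's data or samples, which is what lets the method avoid a single DA-MCMC chain over the full joint model.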
