Parallel Markov Chain Monte Carlo for Non-Gaussian Posterior Distributions (1506.03162v1)

Published 10 Jun 2015 in stat.ME

Abstract: Recent developments in big data and analytics research have produced an abundance of large data sets that are too big to be analyzed in their entirety, due to limits on computer memory or storage capacity. To address these issues, communication-free parallel Markov chain Monte Carlo (MCMC) methods have been developed for Bayesian analysis of big data. These methods partition data into manageable subsets, perform independent Bayesian MCMC analysis on each subset, and combine the subset posterior samples to estimate the full data posterior. Current approaches to combining subset posterior samples include sample averaging, weighted averaging, and kernel smoothing techniques. Although these methods work well for Gaussian posteriors, they are not well-suited to non-Gaussian posterior distributions. Here, we develop a new direct density product method for combining subset marginal posterior samples to estimate full data marginal posterior densities. Using a commonly implemented distance metric, we show in simulation studies of Bayesian models with non-Gaussian posteriors that our method outperforms the existing methods in approximating the full data marginal posteriors. Since our method estimates only marginal densities, there is no limitation on the number of model parameters analyzed. Our procedure is suitable for Bayesian models whose unknown parameters have fixed dimension and lie in continuous parameter spaces.
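The abstract describes the general communication-free workflow: partition the data, run an independent MCMC chain on each subset, and combine the subset posteriors, here by multiplying marginal posterior density estimates. The sketch below illustrates that workflow under simplifying assumptions of my own (a one-parameter Gaussian-mean model, a flat prior, a random-walk Metropolis sampler, and a grid-based kernel density product); it is not the paper's specific direct density product estimator, whose density estimation and weighting details are not given in the abstract.

```python
# Illustrative sketch of communication-free parallel MCMC with a density-product
# combination of subset marginal posteriors. The model, sampler, and KDE-based
# combination step are placeholder choices, not the paper's exact algorithm.

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def subset_mcmc(data, n_iter=5000, step=0.2):
    """Random-walk Metropolis for the mean theta of a N(theta, 1) model on one
    data subset, under a flat prior. Returns posterior draws of theta."""
    theta = data.mean()
    draws = np.empty(n_iter)
    def log_post(t):
        return -0.5 * np.sum((data - t) ** 2)  # flat prior, unit variance
    lp = log_post(theta)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        draws[i] = theta
    return draws

# Partition the full data into subsets; in practice each subset would sit on a
# separate machine and the chains would run with no communication.
full_data = rng.normal(loc=1.5, scale=1.0, size=10_000)
subsets = np.array_split(full_data, 10)
subset_draws = [subset_mcmc(s) for s in subsets]

# Density-product combination on a common grid: estimate each subset marginal
# with a KDE, multiply the densities pointwise (sum in log space), and
# renormalize to approximate the full-data marginal posterior.
grid = np.linspace(1.3, 1.7, 400)
log_prod = np.zeros_like(grid)
for draws in subset_draws:
    log_prod += np.log(gaussian_kde(draws)(grid) + 1e-300)
density = np.exp(log_prod - log_prod.max())
density /= np.trapz(density, grid)
```

Because the combination acts on one marginal at a time, the same grid-based product can be repeated independently for each parameter, which is consistent with the abstract's point that the number of model parameters is not a limitation.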
