
A fully Bayesian strategy for high-dimensional hierarchical modeling using massively parallel computing

Published 21 Jun 2016 in stat.CO | (1606.06659v1)

Abstract: Markov chain Monte Carlo (MCMC) is the predominant tool used in Bayesian parameter estimation for hierarchical models. When the model expands due to an increasing number of hierarchical levels, number of groups at a particular level, or number of observations in each group, a fully Bayesian analysis via MCMC can easily become computationally demanding, even intractable. We illustrate how the steps in an MCMC algorithm for hierarchical models are predominantly one of two types: conditionally independent draws, or low-dimensional draws based on summary statistics of parameters at higher levels of the hierarchy. Parallel computing can increase efficiency by performing embarrassingly parallel computations for the conditionally independent draws and calculating the summary statistics using parallel reductions. During the MCMC algorithm, we record running means and means of squared parameter values to allow convergence diagnosis and posterior inference while avoiding the costly memory-transfer bottleneck. We demonstrate the effectiveness of the algorithm on a model motivated by next-generation sequencing data, and we release our implementation in the R packages fbseq and fbseqCUDA.
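The two step types described in the abstract, and the running-mean accumulation, can be illustrated with a minimal sketch. This is not the authors' implementation (which is in the R packages fbseq and fbseqCUDA); it is a hypothetical Gibbs sampler for a simple two-level normal model, where the group-level draws are conditionally independent (here vectorized, standing in for an embarrassingly parallel kernel), the top-level draw depends only on a reduction over the group parameters, and only running means and mean squares are accumulated instead of full chains:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: y_gi ~ N(theta_g, 1), theta_g ~ N(mu, 1), mu ~ N(0, 100).
G, n = 50, 20
true_theta = rng.normal(0.0, 1.0, size=G)
y = true_theta[:, None] + rng.normal(size=(G, n))
ybar = y.mean(axis=1)  # per-group sufficient statistics

iters = 2000
mu = 0.0
run_mean = np.zeros(G + 1)  # running means of (theta_1..theta_G, mu)
run_sq = np.zeros(G + 1)    # running means of squared values

for t in range(1, iters + 1):
    # Step type 1: conditionally independent draws for the theta_g.
    # Each group's full conditional depends only on its own data and mu,
    # so all G draws can happen in parallel (vectorized here).
    post_var = 1.0 / (n + 1.0)
    post_mean = post_var * (n * ybar + mu)
    theta = post_mean + np.sqrt(post_var) * rng.normal(size=G)

    # Step type 2: a low-dimensional draw for mu that depends on the
    # theta_g only through a summary statistic (a parallel reduction).
    s = theta.sum()
    v = 1.0 / (G + 1.0 / 100.0)
    mu = v * s + np.sqrt(v) * rng.normal()

    # Record running means and mean squares instead of storing the full
    # chain, avoiding the memory-transfer bottleneck the paper describes.
    draw = np.append(theta, mu)
    run_mean += (draw - run_mean) / t
    run_sq += (draw**2 - run_sq) / t

# Marginal posterior standard deviations recovered from the two accumulators.
post_sd = np.sqrt(run_sq - run_mean**2)
```

The running-mean updates use the standard online form, so per-iteration storage is O(number of parameters) regardless of chain length; a GPU implementation would keep these accumulators on the device and transfer them back once.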

Authors (2)
