
Posterior model probabilities computed from model-specific Gibbs output

Published 1 Dec 2010 in stat.CO, stat.AP, and stat.ME (arXiv:1012.0073v2)

Abstract: Reversible jump Markov chain Monte Carlo (RJMCMC) extends ordinary MCMC methods for use in Bayesian multimodel inference. We show that RJMCMC can be implemented as Gibbs sampling with alternating updates of a model indicator and a vector-valued "palette" of parameters denoted $\bm \psi$. Just as an artist uses a palette to mix dabs of color for specific needs, we create model-specific parameters from the set available in $\bm \psi$. This description not only removes some of the mystery of RJMCMC, but also provides a basis for fitting models one at a time using ordinary MCMC and computing model weights or Bayes factors by post-processing the Monte Carlo output. We illustrate our procedure using several examples.
