
Posterior model probabilities computed from model-specific Gibbs output (arXiv:1012.0073v2)

Published 1 Dec 2010 in stat.CO, stat.AP, and stat.ME

Abstract: Reversible jump Markov chain Monte Carlo (RJMCMC) extends ordinary MCMC methods for use in Bayesian multimodel inference. We show that RJMCMC can be implemented as Gibbs sampling with alternating updates of a model indicator and a vector-valued "palette" of parameters denoted $\bm \psi$. Like an artist uses the palette to mix dabs of color for specific needs, we create model-specific parameters from the set available in $\bm \psi$. This description not only removes some of the mystery of RJMCMC, but also provides a basis for fitting models one at a time using ordinary MCMC and computing model weights or Bayes factors by post-processing the Monte Carlo output. We illustrate our procedure using several examples.
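The alternating update the abstract describes (model indicator, then the parameter "palette" $\bm \psi$) can be sketched as a product-space Gibbs sampler in the style of Carlin and Chib. Everything below is an illustrative assumption, not the paper's own code: a toy two-model Gaussian comparison (M1: known mean 0 vs. M2: unknown mean), a single palette entry `mu`, and an N(0, 1) pseudo-prior for `mu` under M1 chosen to match its prior so it cancels in the indicator update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y_i ~ N(2, 1), so the unknown-mean model M2 should dominate.
y = rng.normal(2.0, 1.0, size=20)
n, ybar = len(y), y.mean()

def log_lik(y, mu):
    """Gaussian log-likelihood with unit variance and mean mu."""
    return -0.5 * np.sum((y - mu) ** 2) - 0.5 * len(y) * np.log(2 * np.pi)

# Product-space Gibbs sampler: the palette psi holds mu; under M1
# (mean fixed at 0) mu is refreshed from a pseudo-prior, chosen here
# as its N(0, 1) prior so that factor cancels in the indicator update.
iters, burn = 20_000, 2_000
k, mu = 1, 0.0                      # model indicator (1 or 2), palette entry
k_draws = np.empty(iters, dtype=int)
for t in range(iters):
    # (1) Update the palette entry mu given the current model.
    if k == 2:
        post_var = 1.0 / (n + 1.0)              # N(0, 1) prior on mu
        mu = rng.normal(n * ybar * post_var, np.sqrt(post_var))
    else:
        mu = rng.normal(0.0, 1.0)               # pseudo-prior draw
    # (2) Update the model indicator given mu, with prior
    # p(M1) = p(M2) = 1/2; the prior/pseudo-prior factor for mu is
    # identical under both models and cancels.
    lw = np.array([log_lik(y, 0.0), log_lik(y, mu)])
    w = np.exp(lw - lw.max())
    w /= w.sum()
    k = 2 if rng.random() < w[1] else 1
    k_draws[t] = k

# Posterior model probability of M2, estimated from the indicator chain.
prob_m2 = np.mean(k_draws[burn:] == 2)
print(f"P(M2 | y) ~= {prob_m2:.3f}")
```

In this toy setup the data strongly favor M2, so the estimated posterior model probability is close to 1; the same indicator chain is what the paper post-processes, model by model, to recover model weights or Bayes factors.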
