Bayesian model averaging via mixture model estimation (1711.10016v2)
Abstract: A new approach for Bayesian model averaging (BMA) and selection is proposed, based on the mixture model approach for hypothesis testing in Kamary et al., 2014. Inheriting the good properties of this approach, it extends BMA to cases where improper priors are chosen for parameters that are common to all candidate models. From an algorithmic point of view, our approach consists of sampling from the posterior distribution of the single-datum mixture of all candidate models, weighted by their prior probabilities. We show that this posterior distribution is equal to the 'Bayesian-model averaged' posterior distribution over all candidate models, weighted by their posterior probabilities. From this BMA posterior sample, a simple Monte Carlo estimate of each model's posterior probability is derived, as well as importance sampling estimates for expectations under each model's posterior distribution.
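To make the last step of the abstract concrete, here is a minimal sketch of the Monte Carlo estimator of posterior model probabilities obtained from a sample of the BMA (mixture) posterior. It is not the paper's mixture MCMC scheme: it assumes a hypothetical conjugate Gaussian toy problem (two candidate models for a normal mean, with names such as tau and prior_probs invented here) in which the mixture posterior can be sampled exactly, so that the frequency estimator and a per-model posterior expectation can be checked against their analytic values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup, for illustration only: y_i ~ N(mu, 1), i = 1..n.
# Candidate models:
#   M1 (index 0): mu = 0 fixed (no free parameter)
#   M2 (index 1): mu ~ N(0, tau^2)
n, true_mu, tau = 50, 0.3, 1.0
y = rng.normal(true_mu, 1.0, size=n)
ybar = y.mean()
prior_probs = np.array([0.5, 0.5])           # prior model probabilities

# Log marginal likelihoods (analytic here thanks to conjugacy).
log_m1 = -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum(y ** 2)
s2 = 1.0 / n + tau ** 2                      # variance of ybar under M2, mu integrated out
log_m2 = (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((y - ybar) ** 2)
          + 0.5 * np.log(2 * np.pi / n)
          - 0.5 * np.log(2 * np.pi * s2) - 0.5 * ybar ** 2 / s2)

# Exact posterior model probabilities, kept only as a reference value.
log_w = np.log(prior_probs) + np.array([log_m1, log_m2])
exact_post = np.exp(log_w - log_w.max())
exact_post /= exact_post.sum()

# Draw a sample from the BMA (mixture) posterior over (model index, mu).
N = 100_000
k = rng.choice(2, size=N, p=exact_post)      # model indicator per draw
post_var = 1.0 / (n + 1.0 / tau ** 2)        # M2 conditional posterior of mu
post_mean = post_var * n * ybar
mu = np.where(k == 1, rng.normal(post_mean, np.sqrt(post_var), size=N), 0.0)

# Monte Carlo estimate of each model's posterior probability:
# the relative frequency of its indicator in the BMA posterior sample.
mc_post = np.bincount(k, minlength=2) / N
print("exact P(M_k | y):", exact_post)
print("MC    P(M_k | y):", mc_post)

# Expectation under M2's posterior, estimated from the same mixture sample.
print("E[mu | y, M2] ~", mu[k == 1].mean(), "(exact:", post_mean, ")")
```

In the paper's setting the mixture posterior would be explored by MCMC rather than sampled exactly, and the per-model expectations would rely on importance sampling corrections; the frequency and conditional-mean estimators above only illustrate how such a sample is post-processed.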