Bayesian model averaging via mixture model estimation (1711.10016v2)

Published 27 Nov 2017 in stat.ME

Abstract: A new approach for Bayesian model averaging (BMA) and selection is proposed, based on the mixture-model approach for hypothesis testing of Kamary et al. (2014). Inheriting the good properties of that approach, it extends BMA to cases where improper priors are chosen for parameters common to all candidate models. From an algorithmic point of view, our approach consists of sampling from the posterior distribution of the single-datum mixture of all candidate models, weighted by their prior probabilities. We show that this posterior distribution equals the 'Bayesian-model averaged' posterior distribution over all candidate models, weighted by their posterior probabilities. From this BMA posterior sample, a simple Monte Carlo estimate of each model's posterior probability is derived, as well as importance sampling estimates for expectations under each model's posterior distribution.
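
The abstract outlines enough of the algorithm to sketch the core idea in code. The sketch below is a minimal, hypothetical illustration, not the paper's implementation: it uses two toy Gaussian models with proper priors (the paper's handling of improper priors on shared parameters is not reproduced here), a data-augmentation Gibbs sampler over the single-dataset mixture with a latent model indicator z, and the frequency of z as the Monte Carlo estimate of each model's posterior probability. All model choices, priors, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n observations from N(0.5, 1), so neither model is exactly true.
n = 30
y = rng.normal(0.5, 1.0, size=n)

# Candidate models (hypothetical choices for this sketch):
#   M1: y_i ~ N(0, 1)                 (no free parameter)
#   M2: y_i ~ N(mu, 1), mu ~ N(0, 1)  (proper conjugate prior)
prior_w = np.array([0.5, 0.5])  # prior model probabilities p_k

def log_lik_m1(y):
    return -0.5 * np.sum(y**2) - 0.5 * len(y) * np.log(2 * np.pi)

def log_lik_m2(y, mu):
    return -0.5 * np.sum((y - mu) ** 2) - 0.5 * len(y) * np.log(2 * np.pi)

# Gibbs sampler targeting the posterior of the single-datum mixture
#   f(y | mu) = p_1 f_1(y) + p_2 f_2(y | mu),
# augmented with a model indicator z in {1, 2}.
T, burn = 20_000, 2_000
z_draws = np.empty(T, dtype=int)
mu_draws = np.empty(T)
mu = 0.0
for t in range(T):
    # z | mu, y : proportional to p_k times model k's likelihood
    # (log-sum-exp shift for numerical stability)
    logw = np.log(prior_w) + np.array([log_lik_m1(y), log_lik_m2(y, mu)])
    w = np.exp(logw - logw.max())
    w /= w.sum()
    z = rng.choice([1, 2], p=w)
    # mu | z, y : model 2's conjugate posterior if z == 2, else its prior
    if z == 2:
        post_var = 1.0 / (n + 1.0)
        mu = rng.normal(post_var * y.sum(), np.sqrt(post_var))
    else:
        mu = rng.normal(0.0, 1.0)
    z_draws[t], mu_draws[t] = z, mu

# Monte Carlo estimate of each model's posterior probability:
z_post = z_draws[burn:]
print(f"P(M1 | y) ~ {np.mean(z_post == 1):.3f},  P(M2 | y) ~ {np.mean(z_post == 2):.3f}")

# Draws of mu on iterations with z == 2 target model 2's own posterior,
# so within-model expectations can be estimated from that subsample:
mu_m2 = mu_draws[burn:][z_post == 2]
print(f"E[mu | M2, y] ~ {mu_m2.mean():.3f}")
```

Note that the within-model estimate above simply conditions on z == 2, which wastes samples when a model's posterior probability is small; this is presumably why the abstract proposes importance sampling estimates for expectations under each model's posterior instead.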
