
Adaptive MC^3 and Gibbs algorithms for Bayesian Model Averaging in Linear Regression Models (1306.6028v1)

Published 25 Jun 2013 in stat.CO

Abstract: The MC$^3$ (Madigan and York, 1995) and Gibbs (George and McCulloch, 1997) samplers are the most widely implemented algorithms for Bayesian Model Averaging (BMA) in linear regression models. These samplers draw a variable at random in each iteration using uniform selection probabilities and then propose to update that variable. This may be computationally inefficient if the number of variables is large and many variables are redundant. In this work, we introduce adaptive versions of these samplers that retain their simplicity in implementation and reduce the selection probabilities of the many redundant variables. The improvements in efficiency for the adaptive samplers are illustrated in real and simulated datasets.
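
As a rough illustration of the idea described in the abstract, the sketch below implements an MC$^3$-style sampler over variable-inclusion vectors, replacing the uniform variable-selection step with weights tilted toward variables that are frequently included. The BIC approximation to the marginal likelihood, the implicit uniform model prior, the mixing parameter `eps`, and the particular adaptation rule are illustrative assumptions only, not the scheme proposed in the paper.

```python
# Minimal sketch of an adaptive MC^3 sampler for BMA in linear regression.
# Assumptions (not from the paper): BIC-approximated marginal likelihood,
# uniform model prior, and selection weights built from running inclusion counts.
import numpy as np

def log_marginal_bic(X, y, gamma):
    """BIC-based approximation to the log marginal likelihood of model gamma."""
    n = len(y)
    idx = np.flatnonzero(gamma)
    Xg = np.column_stack([np.ones(n), X[:, idx]])  # intercept + selected columns
    beta, *_ = np.linalg.lstsq(Xg, y, rcond=None)
    rss = np.sum((y - Xg @ beta) ** 2)
    k = Xg.shape[1]
    return -0.5 * n * np.log(rss / n) - 0.5 * k * np.log(n)

def adaptive_mc3(X, y, n_iter=10_000, eps=0.01, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    gamma = np.zeros(p, dtype=int)          # start from the null model
    log_m = log_marginal_bic(X, y, gamma)
    incl_counts = np.ones(p)                # running inclusion counts used for adaptation
    incl_freq = np.zeros(p)                 # Monte Carlo estimate of inclusion probabilities
    for t in range(1, n_iter + 1):
        # Selection probabilities: proportional to estimated inclusion frequencies,
        # mixed with a uniform component so every variable keeps a positive
        # chance of being proposed.
        w = incl_counts / incl_counts.sum()
        w = (1 - eps) * w + eps / p
        j = rng.choice(p, p=w)
        # The same weight w[j] applies to flipping variable j in either direction,
        # so the proposal is symmetric and the usual MC^3 acceptance ratio applies.
        prop = gamma.copy()
        prop[j] = 1 - prop[j]
        log_m_prop = log_marginal_bic(X, y, prop)
        if np.log(rng.uniform()) < log_m_prop - log_m:
            gamma, log_m = prop, log_m_prop
        incl_counts += gamma                # shift future proposals toward retained variables
        incl_freq += (gamma - incl_freq) / t
    return incl_freq

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 30))
    y = X[:, 0] - 2 * X[:, 3] + rng.normal(size=200)  # only variables 0 and 3 matter
    print(np.round(adaptive_mc3(X, y, n_iter=5000), 2))
```

Because the weights change over iterations, a practical implementation would need the adaptation to diminish (or be otherwise controlled) to preserve ergodicity, which is the kind of issue the paper's adaptive schemes are designed to address.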
