
Approximate Gibbs sampler for Bayesian Huberized lasso (2204.00237v2)

Published 1 Apr 2022 in stat.ME

Abstract: The Bayesian lasso is well known as a Bayesian alternative to the Lasso. Although an advantage of the Bayesian lasso is that it provides full probabilistic uncertainty quantification for the parameters, the corresponding posterior distribution can be sensitive to outliers. To overcome this problem, robust Bayesian regression models have been proposed in recent years. In this paper, we consider robust and efficient estimation for Bayesian Huberized lasso regression from a fully Bayesian perspective. A new posterior computation algorithm for Bayesian Huberized lasso regression is proposed. The proposed approximate Gibbs sampler is based on an approximation of the full conditional distribution, and it makes it possible to estimate the tuning parameter that controls the robustness of the pseudo-Huber loss function. Some theoretical properties of the posterior distribution are also derived. We illustrate the performance of the proposed method through simulation studies and real data examples.
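The abstract's key ingredient is the pseudo-Huber loss, whose robustness is governed by a tuning parameter. The paper's exact parameterization is not given in the abstract; the sketch below uses the standard pseudo-Huber definition, quadratic for small residuals and linear for large ones, with a hypothetical parameter name `delta` for the robustness tuning parameter:

```python
import numpy as np

def pseudo_huber(r, delta=1.0):
    """Standard pseudo-Huber loss (assumed parameterization, not the paper's).

    Behaves like r**2 / 2 when |r| << delta (quadratic near zero)
    and like delta * |r| when |r| >> delta (linear in the tails),
    so large residuals (outliers) are penalized far less than under
    a squared-error loss.
    """
    r = np.asarray(r, dtype=float)
    return delta**2 * (np.sqrt(1.0 + (r / delta) ** 2) - 1.0)

# A residual of 10 contributes ~9.05 under pseudo-Huber (delta = 1)
# versus 50 under squared-error loss r**2 / 2, which is why the
# resulting posterior is less sensitive to outliers.
print(pseudo_huber(10.0, delta=1.0))
```

Smaller values of the tuning parameter make the loss switch to linear growth sooner, yielding stronger downweighting of outliers; the paper's contribution is estimating this parameter within the Gibbs sampler rather than fixing it in advance.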
