
Bayesian Fusion Estimation via t-Shrinkage (1812.10594v1)

Published 27 Dec 2018 in stat.ME

Abstract: Shrinkage priors have achieved great success in many data analyses; however, their applications mostly focus on the Bayesian modeling of sparse parameters. In this work, we apply Bayesian shrinkage to model a high-dimensional parameter that possesses an unknown blocking structure. We propose to impose a heavy-tailed shrinkage prior, e.g., a $t$ prior, on the differences of successive parameter entries; such a fusion prior shrinks successive differences toward zero and hence induces posterior blocking. Compared to the conventional Bayesian fused lasso, which uses a Laplace fusion prior, the $t$ fusion prior induces a stronger shrinkage effect and enjoys a nice posterior consistency property. Simulation studies and real data analyses show that $t$ fusion outperforms both the frequentist fusion estimator and the Bayesian Laplace fusion prior. This $t$-fusion strategy is further developed to conduct Bayesian clustering analysis, and simulations show that the proposed algorithm achieves better posterior distributional convergence than classical Dirichlet process modeling.
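
To make the fusion idea concrete, here is a minimal illustrative sketch (not the authors' code) of the core device the abstract describes: a Student-$t$ prior placed on successive differences of a mean vector observed with Gaussian noise. For simplicity it computes a MAP estimate via numerical optimization rather than sampling the full posterior; the hyperparameters `nu` and `tau`, the noise variance, and the thresholding step are assumptions made for illustration only.

```python
# Sketch: MAP estimation under a t-fusion prior on successive differences.
# Heavy tails let large jumps survive while small differences are shrunk
# toward zero, producing a nearly piecewise-constant (blocked) estimate.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Piecewise-constant truth with an unknown blocking structure.
truth = np.repeat([0.0, 3.0, -2.0], [40, 30, 30])
y = truth + rng.normal(scale=1.0, size=truth.size)

sigma2 = 1.0        # noise variance (assumed known here)
nu, tau = 1.0, 0.05  # heavy-tailed t prior on the differences (illustrative values)

def neg_log_posterior(theta):
    # Gaussian likelihood for the observations.
    loglik = -0.5 * np.sum((y - theta) ** 2) / sigma2
    # Student-t log density (up to constants) on each successive difference.
    d = np.diff(theta)
    logprior = -0.5 * (nu + 1.0) * np.sum(np.log1p((d / tau) ** 2 / nu))
    return -(loglik + logprior)

res = minimize(neg_log_posterior, x0=y.copy(), method="L-BFGS-B")
theta_map = res.x

# Differences shrunk toward zero yield blocks; thresholding recovers change points.
print("estimated change points:", np.where(np.abs(np.diff(theta_map)) > 0.5)[0])
```

Swapping the $t$ log prior for a Laplace penalty on the differences recovers a fused-lasso-style estimate, which illustrates the comparison drawn in the abstract.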
