
Bayesian Dynamic Fused LASSO (1905.12275v2)

Published 29 May 2019 in stat.ME

Abstract: A new class of Markov processes is proposed to realize flexible shrinkage effects for dynamic models. The transition density of the new process consists of two penalty functions, similar in functional form to the Bayesian fused LASSO, that shrink the current state variable toward its previous value and toward zero. The normalizing constant of the density, which cannot be ignored in the posterior computation, is shown to be essentially a log-geometric mixture of double-exponential densities. This process serves as the state equation of dynamic regression models, which are shown to be conditionally Gaussian and linear in the state variables, so that forward filtering and backward sampling can be used for posterior computation by the Gibbs sampler. The overshrinkage problem inherent in the LASSO is moderated by a hierarchical extension, which can even realize the shrinkage of horseshoe priors marginally. For illustration, the new prior is compared with the standard double-exponential prior in estimation and prediction with dynamic linear models. It is also applied to time-varying vector autoregressive models for US macroeconomic data, where we examine the (dis)similarity of the additional shrinkage effect to dynamic variable selection, specifically the latent threshold models.
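As a rough sketch of the kind of transition kernel the abstract describes (the notation here is illustrative and not taken from the paper): for a scalar state variable \theta_t and penalty parameters \lambda_1, \lambda_2 > 0, a fused-LASSO-type transition density has the form

p(\theta_t \mid \theta_{t-1}) \propto \exp\{-\lambda_1 |\theta_t| - \lambda_2 |\theta_t - \theta_{t-1}|\},

where the first penalty shrinks \theta_t toward zero and the second toward its previous value \theta_{t-1}. The constant of proportionality depends on \theta_{t-1}, which is why the abstract stresses that the normalizing constant cannot be ignored in posterior computation.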
