On Global-local Shrinkage Priors for Count Data (1907.01333v2)

Published 2 Jul 2019 in stat.ME

Abstract: Global-local shrinkage priors have been recognized as a useful class of priors that can strongly shrink small signals toward prior means while leaving large signals unshrunk. Although such priors have been extensively studied under Gaussian responses, count responses are frequently encountered in practice, and existing knowledge of global-local shrinkage priors cannot be directly carried over to them. In this paper, we discuss global-local shrinkage priors for analyzing sequences of counts. We provide sufficient conditions under which the posterior mean keeps the observation as it is for very large signals, a property known as tail robustness. We then propose tractable priors that satisfy the derived conditions approximately or exactly, and develop an efficient posterior computation algorithm for Bayesian inference. The proposed methods are free from tuning parameters; that is, all hyperparameters are automatically estimated from the data. We demonstrate the proposed methods through simulation and an application to a real dataset.
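The abstract does not spell out the model, so the sketch below is only a rough illustration of the tail robustness idea, not the paper's actual prior or algorithm. It compares posterior means for Poisson counts under a plain Gamma prior versus a Gamma prior whose scale carries a heavy-tailed (half-Cauchy) local component; the model form, the half-Cauchy choice, and the hyperparameter values (a = 1.0, tau = 2.0) are all assumptions made for illustration.

```python
# Illustrative sketch only: a generic Poisson-Gamma model with a heavy-tailed
# local scale, to show the qualitative "tail robustness" behavior the abstract
# describes. The prior and hyperparameters below are assumed for illustration
# and are NOT the paper's proposed method.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

a, tau = 1.0, 2.0   # assumed shape and global scale (hypothetical values)
S = 20000           # Monte Carlo draws for the local scale u

def post_mean_given_u(y, u):
    """E[lambda | y, u] when lambda | u ~ Gamma(a, rate=a/(tau*u)) and y ~ Poisson(lambda)."""
    b = a / (tau * u)            # Gamma rate given the local scale
    return (y + a) / (1.0 + b)   # conjugate Poisson-Gamma posterior mean

def post_mean_heavy(y):
    """Posterior mean with a half-Cauchy local scale, via importance sampling from the prior."""
    u = np.abs(stats.cauchy.rvs(size=S, random_state=rng))  # heavy-tailed local prior draws
    b = a / (tau * u)
    # Marginally, y | u is negative binomial with n = a and p = b / (b + 1)
    logw = stats.nbinom.logpmf(y, a, b / (b + 1.0))
    w = np.exp(logw - logw.max())
    return np.sum(w * post_mean_given_u(y, u)) / np.sum(w)

def post_mean_gamma_only(y):
    """Posterior mean with no local scale (u fixed at 1), i.e. a plain Gamma(a, a/tau) prior."""
    return post_mean_given_u(y, 1.0)

for y in [0, 1, 5, 20, 100]:
    print(f"y = {y:4d}   gamma-only: {post_mean_gamma_only(y):7.2f}   "
          f"heavy-tailed local: {post_mean_heavy(y):7.2f}")
```

Under the plain Gamma prior the posterior mean shrinks every observation by a fixed factor, so a large count like y = 100 is pulled well below 100. With the heavy-tailed local scale, small counts are still shrunk strongly toward zero while the posterior mean for large counts stays close to the observation, which is the kind of behavior the abstract refers to as tail robustness.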
