
MCMC for GLMMs (2204.01866v2)

Published 4 Apr 2022 in stat.ME, stat.AP, and stat.CO

Abstract: Generalized linear mixed models (GLMMs) are often used for analyzing correlated non-Gaussian data. The likelihood function in a GLMM is available only as a high-dimensional integral, so closed-form inference and prediction are not possible for GLMMs. Since the likelihood is not available in closed form, the associated posterior densities in Bayesian GLMMs are also intractable. Generally, Markov chain Monte Carlo (MCMC) algorithms are used for conditional simulation in GLMMs and for exploring these posterior densities. In this article, we present different state-of-the-art MCMC algorithms for fitting GLMMs. These MCMC algorithms include efficient data augmentation strategies, as well as diffusion-based and Hamiltonian-dynamics-based methods. The Langevin and Hamiltonian Monte Carlo methods presented here are applicable to any GLMM, and are illustrated using three of the most popular GLMMs, namely, the logistic and probit GLMMs for binomial data and the Poisson-log GLMM for count data. We also present efficient data augmentation algorithms for probit and logistic GLMMs. Some of these algorithms are compared using a numerical example.
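To make the diffusion-based approach mentioned in the abstract concrete, the sketch below shows a Metropolis-adjusted Langevin (MALA) sampler for a logistic random-intercept GLMM. This is a minimal illustration under assumed settings (a single covariate, a fixed random-effect standard deviation, and a diffuse Gaussian prior on the fixed effect), not the paper's own implementation; all variable names and tuning values are illustrative.

```python
# Minimal MALA sketch for a logistic random-intercept GLMM (illustrative only).
# Model: y_ij ~ Bernoulli(p_ij), logit(p_ij) = beta * x_ij + u_i, u_i ~ N(0, sigma_u^2).
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: m groups, n observations per group, one covariate.
m, n = 20, 10
sigma_u = 1.0                                   # random-intercept std dev (held fixed here)
x = rng.normal(size=(m, n))
beta_true, u_true = 0.8, sigma_u * rng.normal(size=m)
p = 1.0 / (1.0 + np.exp(-(beta_true * x + u_true[:, None])))
y = rng.binomial(1, p)

def log_post_and_grad(theta):
    """Unnormalized log posterior of (beta, u_1,...,u_m) and its gradient."""
    beta, u = theta[0], theta[1:]
    eta = beta * x + u[:, None]                 # linear predictor
    ll = np.sum(y * eta - np.log1p(np.exp(eta)))        # Bernoulli-logit log-likelihood
    lp = -0.5 * beta**2 / 100.0 - 0.5 * np.sum(u**2) / sigma_u**2   # N(0,100) prior + random effects
    resid = y - 1.0 / (1.0 + np.exp(-eta))      # y - p
    grad = np.empty_like(theta)
    grad[0] = np.sum(resid * x) - beta / 100.0
    grad[1:] = resid.sum(axis=1) - u / sigma_u**2
    return ll + lp, grad

def mala(n_iter=5000, step=0.05):
    theta = np.zeros(1 + m)
    lp, g = log_post_and_grad(theta)
    draws = np.empty((n_iter, theta.size))
    for t in range(n_iter):
        # Langevin proposal: drift along the gradient plus Gaussian noise.
        mean_fwd = theta + 0.5 * step**2 * g
        prop = mean_fwd + step * rng.normal(size=theta.size)
        lp_p, g_p = log_post_and_grad(prop)
        mean_bwd = prop + 0.5 * step**2 * g_p
        # Metropolis-Hastings correction for the asymmetric proposal.
        log_q_fwd = -np.sum((prop - mean_fwd)**2) / (2 * step**2)
        log_q_bwd = -np.sum((theta - mean_bwd)**2) / (2 * step**2)
        if np.log(rng.uniform()) < lp_p - lp + log_q_bwd - log_q_fwd:
            theta, lp, g = prop, lp_p, g_p
        draws[t] = theta
    return draws

draws = mala()
print("posterior mean of beta:", draws[2500:, 0].mean())
```

The same gradient-based machinery extends to the probit and Poisson-log GLMMs discussed in the paper by swapping in the corresponding link function and log-likelihood; the paper's data augmentation schemes for probit and logistic GLMMs instead introduce latent variables so that conditional updates become tractable.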
