On the convergence complexity of Gibbs samplers for a family of simple Bayesian random effects models

Published 29 Apr 2020 in math.ST and stat.TH | arXiv:2004.14330v2

Abstract: The emergence of big data has led to so-called convergence complexity analysis, which is the study of how Markov chain Monte Carlo (MCMC) algorithms behave as the sample size, $n$, and/or the number of parameters, $p$, in the underlying data set increase. This type of analysis is often quite challenging, in part because existing results for fixed $n$ and $p$ are simply not sharp enough to yield good asymptotic results. One of the first convergence complexity results for an MCMC algorithm on a continuous state space is due to Yang and Rosenthal (2019), who established a mixing time result for a Gibbs sampler (for a simple Bayesian random effects model) that was introduced and studied by Rosenthal (1996). The asymptotic behavior of the spectral gap of this Gibbs sampler is, however, still unknown. We use a recently developed simulation technique (Qin et al., 2019) to provide substantial numerical evidence that the gap is bounded away from 0 as $n \rightarrow \infty$. We also establish a pair of rigorous convergence complexity results for two different Gibbs samplers associated with a generalization of the random effects model considered by Rosenthal (1996). Our results show that, under strong regularity conditions, the spectral gaps of these Gibbs samplers converge to 1 as the sample size increases.
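For intuition, the kind of Gibbs sampler discussed above can be sketched for the simple random effects model of Rosenthal (1996): $Y_i \mid \theta_i \sim N(\theta_i, V)$ and $\theta_i \mid \mu \sim N(\mu, A)$ with a flat prior on $\mu$ and known variance components. This is an illustrative sketch only, not the paper's exact generalization or the simulation technique of Qin et al. (2019); the function name and defaults are hypothetical.

```python
import numpy as np

def gibbs_random_effects(y, V=1.0, A=1.0, n_iter=5000, seed=0):
    """Two-block Gibbs sampler for the simple random effects model
    Y_i | theta_i ~ N(theta_i, V), theta_i | mu ~ N(mu, A), flat prior on mu.
    Variance components V and A are treated as known. Returns the mu draws."""
    rng = np.random.default_rng(seed)
    K = len(y)
    theta = y.copy()                  # initialize random effects at the data
    mu_draws = np.empty(n_iter)
    for t in range(n_iter):
        # mu | theta ~ N(mean(theta), A / K)  (flat prior on mu)
        mu = rng.normal(theta.mean(), np.sqrt(A / K))
        # theta_i | mu, y_i ~ N((A*y_i + V*mu) / (A+V), A*V / (A+V))
        post_mean = (A * y + V * mu) / (A + V)
        post_sd = np.sqrt(A * V / (A + V))
        theta = rng.normal(post_mean, post_sd)
        mu_draws[t] = mu
    return mu_draws

# Synthetic data; under the flat prior the posterior mean of mu is the sample mean.
y = np.random.default_rng(1).normal(2.0, 1.5, size=50)
draws = gibbs_random_effects(y, n_iter=2000)
print(draws[1000:].mean())
```

Because the sampler alternates between only two blocks ($\mu$ and $\theta$), the marginal $\mu$-chain is itself a reversible Markov chain, and its autocorrelation decay is governed by the spectral gap whose asymptotic behavior (as $n \to \infty$) the paper investigates.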
