
MCMC-Based Inference in the Era of Big Data: A Fundamental Analysis of the Convergence Complexity of High-Dimensional Chains (1508.00947v2)

Published 5 Aug 2015 in math.ST, math.PR, stat.ME, and stat.TH

Abstract: Markov chain Monte Carlo (MCMC) lies at the core of modern Bayesian methodology, much of which would be impossible without it. Thus, the convergence properties of MCMCs have received significant attention, and in particular, proving (geometric) ergodicity is of critical interest. Trust in the ability of MCMCs to sample from modern-day high-dimensional posteriors, however, has been limited by a widespread perception that these chains typically experience serious convergence problems. In this paper, we first demonstrate that contemporary methods for obtaining convergence rates have serious limitations when the dimension grows. We then propose a framework for rigorously establishing the convergence behavior of commonly used high-dimensional MCMCs. In particular, we demonstrate theoretically the precise nature and severity of the convergence problems of popular MCMCs when implemented in high dimensions, including phase transitions in the convergence rates in various $n$ and $p$ regimes, and a universality result across an entire spectrum of models. We also show that convergence problems effectively eliminate the apparent safeguard of geometric ergodicity. We then demonstrate theoretical principles by which MCMCs can be constructed and analyzed to yield bounded geometric convergence rates even as the dimension $p$ grows without bound. Additionally, we propose a diagnostic tool for establishing convergence.
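
For context, "geometric ergodicity" in the abstract refers to the standard property that a chain's distribution approaches its target at a geometric rate; the notation below is generic and not necessarily the paper's own. A Markov chain with transition kernel $K$ and stationary distribution $\Pi$ is geometrically ergodic if there exist a constant $\rho < 1$ and a function $M$ such that, for every starting point $x$ and every iteration $t$,

$$\left\| K^{t}(x, \cdot) - \Pi \right\|_{\mathrm{TV}} \;\le\; M(x)\, \rho^{t}.$$

The abstract's concern is that $\rho$ generally depends on the sample size $n$ and dimension $p$: if $\rho(n, p) \to 1$ as $p$ grows, geometric ergodicity holds for each fixed $p$ yet offers little practical safeguard, which is why the paper seeks convergence rates that remain bounded away from 1 even as $p \to \infty$.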

