
Large deviation-based tuning schemes for Metropolis-Hastings algorithms (2409.20337v2)

Published 30 Sep 2024 in math.PR, math.ST, and stat.TH

Abstract: Markov chain Monte Carlo (MCMC) methods are one of the most popular classes of algorithms for sampling from a target probability distribution. A rising trend in recent years is to analyze the convergence of MCMC algorithms using tools from the theory of large deviations. In (Milinanni & Nyquist, 2024), a new framework based on this approach was developed to study the convergence of empirical measures associated with algorithms of Metropolis-Hastings type, a broad and popular sub-class of MCMC methods. The goal of this paper is to leverage these large deviation results to improve the efficiency of Metropolis-Hastings algorithms. Specifically, we use the large deviations rate function (a central object in large deviation theory) to quantify and characterize the algorithms' speed of convergence. We begin by extending the analysis from (Milinanni & Nyquist, 2024), deriving alternative representations of the rate function. Building on this, we establish explicit upper and lower bounds, which we then use to design schemes to tune Metropolis-Hastings algorithms.
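For readers unfamiliar with the algorithm class the abstract refers to, a minimal random-walk Metropolis sampler can be sketched as follows. This is a generic illustration of Metropolis-Hastings type algorithms, not the paper's large deviation-based tuning scheme; the standard normal target and the proposal scale are illustrative choices.

```python
import math
import random

def metropolis_hastings(log_target, x0, proposal_scale, n_steps, seed=0):
    """Random-walk Metropolis: symmetric Gaussian proposals with accept/reject."""
    rng = random.Random(seed)
    x = x0
    samples = []
    accepted = 0
    for _ in range(n_steps):
        # Propose a move from a Gaussian centred at the current state.
        y = x + rng.gauss(0.0, proposal_scale)
        # Acceptance probability min(1, pi(y)/pi(x)), computed in log space.
        log_alpha = log_target(y) - log_target(x)
        if math.log(rng.random()) < log_alpha:
            x = y
            accepted += 1
        samples.append(x)
    return samples, accepted / n_steps

# Illustrative target: standard normal log-density (up to an additive constant).
log_target = lambda x: -0.5 * x * x
samples, acc_rate = metropolis_hastings(log_target, x0=0.0,
                                        proposal_scale=2.4, n_steps=20000)
```

The proposal scale is the kind of tuning parameter the paper's schemes aim to select: too small and the chain moves slowly, too large and most proposals are rejected, and in both regimes the empirical measure converges slowly to the target.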

