Improving the convergence of reversible samplers (1601.08118v2)

Published 29 Jan 2016 in math.PR, math-ph, math.MP, and stat.ME

Abstract: In Monte Carlo methods, the Markov processes used to sample a given target distribution usually satisfy detailed balance, i.e., they are time-reversible. However, relatively recent results have demonstrated that appropriate reversible and irreversible perturbations can accelerate convergence to equilibrium. In this paper we present general design principles that apply to broad classes of Markov processes. Working with the generator of the Markov process, we prove that for some of the most commonly used performance criteria, namely the spectral gap, the asymptotic variance, and large deviation functionals, sampling is improved by appropriate reversible and irreversible perturbations of an initially given reversible sampler. Moreover, we provide specific constructions of such reversible and irreversible perturbations for commonly used Markov processes such as Markov chains and diffusions. In the case of diffusions, we make the discussion concrete by using the large deviations rate function as a measure of performance.
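
To make the irreversible-perturbation idea for diffusions concrete, the sketch below compares overdamped Langevin dynamics, dX_t = -∇U(X_t) dt + √2 dW_t, with a perturbed version dX_t = (-I + cC)∇U(X_t) dt + √2 dW_t, where C is antisymmetric. Because ∇U · C∇U = 0 and, for the quadratic U used here, div(C∇U) = trace(CA) = 0, the added drift is divergence-free with respect to the target, so both processes share the invariant measure ∝ exp(-U) while the perturbed one breaks detailed balance. This is a standard construction from this literature rather than the paper's specific proposal; the names (A, C, grad_U, euler_maruyama) and all parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's exact construction):
# sample a 2-D Gaussian target pi(x) ~ exp(-U(x)), U(x) = x^T A x / 2,
# with plain overdamped Langevin dynamics versus an irreversibly
# perturbed drift (-I + c*C) grad U, C antisymmetric.

rng = np.random.default_rng(0)

A = np.array([[1.0, 0.0], [0.0, 10.0]])   # precision matrix of the target
C = np.array([[0.0, 1.0], [-1.0, 0.0]])   # antisymmetric: C^T = -C

def grad_U(x):
    return A @ x

def euler_maruyama(c_strength, n_steps=200_000, dt=1e-3):
    """Simulate dX = (-I + c*C) grad U dt + sqrt(2) dW by Euler-Maruyama."""
    x = np.zeros(2)
    traj = np.empty((n_steps, 2))
    drift_mat = -np.eye(2) + c_strength * C
    for i in range(n_steps):
        x = x + drift_mat @ grad_U(x) * dt + np.sqrt(2.0 * dt) * rng.standard_normal(2)
        traj[i] = x
    return traj

# Ergodic averages of f(x) = x_1^2; its true mean under pi is
# A^{-1}[0, 0] = 1.0. Both runs target the same distribution, but the
# irreversible run (c > 0) typically exhibits lower autocorrelation and
# hence smaller asymptotic variance of the time average.
for c in (0.0, 5.0):
    traj = euler_maruyama(c)
    print(f"c = {c}: estimate of E[x_1^2] = {np.mean(traj[:, 0] ** 2):.3f}")
```

The strength c and the choice of C are free design parameters here; the paper's contribution is to analyze, at the level of the generator, which such perturbations provably improve criteria like the spectral gap, asymptotic variance, and large deviations rate function.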
