
A technical report on hitting times, mixing and cutoff (1501.01869v6)

Published 8 Jan 2015 in math.PR

Abstract: Consider a sequence of continuous-time irreducible reversible Markov chains and a sequence of initial distributions, $\mu_n$. The sequence is said to exhibit $\mu_n$-cutoff if the convergence to stationarity in total variation distance is abrupt w.r.t. this sequence of initial distributions. In this work we give a characterization of $\mu_n$-cutoff for an arbitrary sequence of initial distributions $\mu_n$ (in the above setup). Our characterization is expressed in terms of hitting times of sets which are "worst" w.r.t. $\mu_n$. Consider a Markov chain on $\Omega$ whose stationary distribution is $\pi$. Let $t_{\mathrm{H}}(\alpha) := \max_{x \in \Omega,\, A \subset \Omega :\, \pi(A) \ge \alpha} \mathbb{E}_{x}[T_{A}]$ be the expected hitting time of the worst set of stationary mass at least $\alpha$. It was recently proved by Peres and Sousi and independently by Oliveira that $t_{\mathrm{H}}(1/4)$ captures the order of the mixing time. In this work we further refine this connection and show that $\mu_n$-cutoff can be characterized in terms of concentration of hitting times (starting from $\mu_n$) of sets which are worst in expectation w.r.t. $\mu_n$. Conversely, we construct a counter-example which demonstrates that in general cutoff (as opposed to cutoff w.r.t. a certain sequence of initial distributions) cannot be characterized in this manner. Finally, we also prove that there exists an absolute constant $C$ such that for every Markov chain $\epsilon(t_{\mathrm{H}}(\epsilon) - t_{\mathrm{H}}(1-\epsilon)) \le C t_{\mathrm{rel}} |\log \epsilon|$ for all $0 < \epsilon < 1/2$, where $t_{\mathrm{rel}}$ is the inverse of the spectral gap of the chain.
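
The quantity $t_{\mathrm{H}}(\alpha)$ can be made concrete on a small example. The sketch below (not from the paper) computes it by brute force for a discrete-time lazy random walk on a path; the paper works with continuous-time chains, so this is a simplified, hypothetical illustration of the definition: expected hitting times $\mathbb{E}_x[T_A]$ are obtained by solving a linear system, and the maximum is taken over all starting states $x$ and all sets $A$ with $\pi(A) \ge \alpha$.

```python
# Minimal sketch (hypothetical example, discrete-time simplification):
# compute t_H(alpha) = max_{x, A : pi(A) >= alpha} E_x[T_A] by brute force.
import itertools
import numpy as np

def stationary(P):
    """Stationary distribution pi solving pi P = pi with sum(pi) = 1."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.append(np.zeros(n), 1.0)
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

def expected_hitting_times(P, target):
    """E_x[T_A] for all x, where A = target (set of state indices).
    On B = complement of A, solve (I - P_BB) h_B = 1; h = 0 on A."""
    n = P.shape[0]
    B = [i for i in range(n) if i not in target]
    h = np.zeros(n)
    if B:
        PB = P[np.ix_(B, B)]
        h[B] = np.linalg.solve(np.eye(len(B)) - PB, np.ones(len(B)))
    return h

def t_H(P, alpha):
    """Worst expected hitting time over sets A with pi(A) >= alpha.
    Enumerates all subsets, so only feasible for very small chains."""
    n = P.shape[0]
    pi = stationary(P)
    best = 0.0
    for r in range(1, n + 1):
        for A in itertools.combinations(range(n), r):
            if pi[list(A)].sum() >= alpha:
                best = max(best, expected_hitting_times(P, set(A)).max())
    return best

# Lazy simple random walk on a path with 6 states (hypothetical chain).
n = 6
P = np.zeros((n, n))
for i in range(n):
    P[i, i] = 0.5
    P[i, max(i - 1, 0)] += 0.25
    P[i, min(i + 1, n - 1)] += 0.25

print("t_H(1/4) =", t_H(P, 0.25))
```

By the Peres-Sousi / Oliveira result cited in the abstract, the printed value is of the same order as the total-variation mixing time of this small chain; the brute-force enumeration is only meant to make the definition of $t_{\mathrm{H}}(\alpha)$ tangible.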
