Quenched Large Deviations for Simple Random Walks on Supercritical Percolation Clusters (1501.02730v3)
Abstract: We prove a {\it{quenched}} large deviation principle (LDP) for a simple random walk on a supercritical percolation cluster on $\mathbb{Z}^d$, $d \geq 2$. We take the point of view of the moving particle and first prove a quenched LDP for the distribution of the {\it{pair empirical measures}} of the environment Markov chain. Via a contraction principle, this reduces easily to a quenched LDP for the distribution of the mean velocity of the random walk, and both rate functions admit explicit (variational) formulas. Our results are based on invoking ergodicity arguments in this non-elliptic setting to control the growth of {\it{gradient functions (correctors)}}, which arise naturally via convex variational analysis in the context of homogenization of random Hamilton-Jacobi-Bellman equations, following the arguments of Kosygina, Rezakhanlou and Varadhan (\cite{KRV06}). Although it enjoys some similarities, our gradient function is structurally different from {\it{the}} classical {\it{Kipnis-Varadhan corrector}}, a well-studied object in the context of reversible random motions in random media.
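For orientation, here is a minimal sketch of the shape such a quenched LDP for the mean velocity takes; the symbols $P_\omega$ (quenched law of the walk started at the origin, conditioned on the origin lying in the infinite cluster), $X_n$ (position of the walk at time $n$), and $I$ (deterministic convex rate function) are notational assumptions for illustration, not taken verbatim from the paper:
$$
\limsup_{n\to\infty} \frac{1}{n}\log P_\omega\!\left(\tfrac{X_n}{n}\in C\right) \le -\inf_{x\in C} I(x) \quad \text{for closed } C\subset\mathbb{R}^d,
$$
$$
\liminf_{n\to\infty} \frac{1}{n}\log P_\omega\!\left(\tfrac{X_n}{n}\in G\right) \ge -\inf_{x\in G} I(x) \quad \text{for open } G\subset\mathbb{R}^d,
$$
holding for $\mathbb{P}$-almost every environment $\omega$. In this reading, the rate function $I$ for the mean velocity is obtained from the rate function for the pair empirical measures by the contraction principle, i.e., as an infimum over measures compatible with a given velocity.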