Sampling from the Sherrington-Kirkpatrick Gibbs measure via algorithmic stochastic localization (2203.05093v2)

Published 10 Mar 2022 in math.PR, cond-mat.dis-nn, and cs.DS

Abstract: We consider the Sherrington-Kirkpatrick model of spin glasses at high temperature and no external field, and study the problem of sampling from the Gibbs distribution $\mu$ in polynomial time. We prove that, for any inverse temperature $\beta<1/2$, there exists an algorithm with complexity $O(n^2)$ that samples from a distribution $\mu^{\mathrm{alg}}$ which is close in normalized Wasserstein distance to $\mu$. Namely, there exists a coupling of $\mu$ and $\mu^{\mathrm{alg}}$ such that if $(x,x^{\mathrm{alg}})\in\{-1,+1\}^n\times\{-1,+1\}^n$ is a pair drawn from this coupling, then $n^{-1}\,\mathbb{E}\big[\|x-x^{\mathrm{alg}}\|_2^2\big]=o_n(1)$. The best previous results, by Bauerschmidt and Bodineau and by Eldan, Koehler, and Zeitouni, implied efficient algorithms to approximately sample (under a stronger metric) for $\beta<1/4$. We complement this result with a negative one, by introducing a suitable "stability" property for sampling algorithms, which is verified by many standard techniques. We prove that no stable algorithm can approximately sample for $\beta>1$, even under the normalized Wasserstein metric. Our sampling method is based on an algorithmic implementation of stochastic localization, which progressively tilts the measure $\mu$ towards a single configuration, together with an approximate message passing algorithm that is used to approximate the mean of the tilted measure.
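
For intuition, the sketch below illustrates the kind of scheme the abstract describes: a discretized stochastic-localization process whose drift is the mean of the tilted Gibbs measure, estimated at each step by an AMP/TAP-style iteration. This is a minimal illustration, not the authors' implementation; the coupling model, step size, number of iterations, and the final sign rounding are assumptions made for the example.

```python
# Sketch (not the authors' exact algorithm) of sampling from the SK Gibbs measure
#   mu(x) ∝ exp(beta/2 * x^T A x),  x in {-1,+1}^n,
# via algorithmic stochastic localization. The process adds a growing external
# field y; at each step the mean of the tilted measure
#   mu_y(x) ∝ mu(x) * exp(<y, x>)
# is approximated with an AMP/TAP-style iteration, and y is updated by a
# discretized SDE  y <- y + m(y) dt + sqrt(dt) * g.  All hyperparameters below
# are illustrative choices.

import numpy as np

def sk_couplings(n, rng):
    """Symmetric Gaussian couplings with off-diagonal variance 1/n (GOE-like)."""
    G = rng.normal(size=(n, n)) / np.sqrt(n)
    A = (G + G.T) / np.sqrt(2.0)
    np.fill_diagonal(A, 0.0)
    return A

def amp_mean(A, beta, y, iters=30):
    """Estimate the mean of mu_y(x) ∝ exp(beta/2 x^T A x + <y, x>)
    with a TAP/AMP-style fixed-point iteration (Onsager correction included)."""
    n = A.shape[0]
    m_prev = np.zeros(n)
    m = np.tanh(y)  # start from the external field alone
    for _ in range(iters):
        q = np.mean(m ** 2)
        h = beta * (A @ m) + y - beta ** 2 * (1.0 - q) * m_prev
        m_prev, m = m, np.tanh(h)
    return m

def sample_sk(n=500, beta=0.3, T=5.0, dt=0.02, seed=0):
    """Run the discretized localization SDE, then round the final mean to spins."""
    rng = np.random.default_rng(seed)
    A = sk_couplings(n, rng)
    y = np.zeros(n)
    t = 0.0
    while t < T:
        m = amp_mean(A, beta, y)
        y = y + m * dt + np.sqrt(dt) * rng.normal(size=n)
        t += dt
    m_final = amp_mean(A, beta, y)
    return np.sign(m_final + 1e-12)  # tiny offset breaks exact ties

if __name__ == "__main__":
    x = sample_sk()
    print(x[:20])
```

As time grows, the external field dominates and the tilted measure concentrates on a single configuration, so the rounded mean serves as the output sample.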

Citations (57)
