
Efficient approximation of branching random walk Gibbs measures (2107.11465v2)

Published 23 Jul 2021 in math.PR, math-ph, and math.MP

Abstract: Disordered systems such as spin glasses have been used extensively as models for high-dimensional random landscapes and studied from the perspective of optimization algorithms. In a paper by L. Addario-Berry and the second author, the continuous random energy model (CREM) was proposed as a simple toy model to study the efficiency of such algorithms. The following question was raised in that paper: what is the threshold $\beta_G$, at which sampling (approximately) from the Gibbs measure at inverse temperature $\beta$ becomes algorithmically hard? This paper is a first step towards answering this question. We consider the branching random walk, a time-homogeneous version of the continuous random energy model. We show that a simple greedy search on a renormalized tree yields a linear-time algorithm which approximately samples from the Gibbs measure, for every $\beta < \beta_c$, the (static) critical point. More precisely, we show that for every $\varepsilon>0$, there exists such an algorithm such that the specific relative entropy between the law sampled by the algorithm and the Gibbs measure of inverse temperature $\beta$ is less than $\varepsilon$ with high probability. In the supercritical regime $\beta > \beta_c$, we provide the following hardness result. Under a mild regularity condition, for every $\delta > 0$, there exists $z>0$ such that the running time of any given algorithm approximating the Gibbs measure stochastically dominates a geometric random variable with parameter $e^{-z\sqrt{N}}$ on an event with probability at least $1-\delta$.
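The paper's linear-time algorithm operates on a renormalized tree; as a rough illustration of the underlying greedy idea only (not the paper's actual construction), one can sample a root-to-leaf path on a binary branching random walk by choosing each child locally with Gibbs weights $\propto e^{\beta \cdot \text{increment}}$. The binary tree, standard Gaussian increments, and the function name below are assumptions made for this sketch:

```python
import math
import random

def greedy_gibbs_path(depth, beta, rng):
    """Illustrative greedy sampler on a binary branching random walk.

    Each edge of the tree carries an i.i.d. standard Gaussian increment;
    at every node we pick a child with probability proportional to
    exp(beta * increment), i.e. a local Gibbs choice at inverse
    temperature beta. Returns the chosen increments and the leaf value.
    """
    path = []
    value = 0.0
    for _ in range(depth):
        # Draw the edge increments of the two children at the current node.
        left, right = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        # Local Gibbs weights at inverse temperature beta.
        w_left = math.exp(beta * left)
        w_right = math.exp(beta * right)
        step = left if rng.random() < w_left / (w_left + w_right) else right
        path.append(step)
        value += step
    return path, value

# Example: sample one path of depth 50 at beta = 0.5.
rng = random.Random(0)
path, value = greedy_gibbs_path(50, 0.5, rng)
```

This purely local choice runs in time linear in the depth; the paper's result is that, after renormalizing the tree appropriately, such a greedy search achieves small specific relative entropy against the true Gibbs measure for all $\beta < \beta_c$.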
