Optimal tail exponents in general last passage percolation via bootstrapping & geodesic geometry (2007.03594v2)

Published 7 Jul 2020 in math.PR, math-ph, and math.MP

Abstract: We consider last passage percolation on $\mathbb{Z}^2$ with general weight distributions, which is expected to be a member of the Kardar-Parisi-Zhang (KPZ) universality class. In this model, an oriented path between given endpoints which maximizes the sum of the i.i.d. weight variables associated to its vertices is called a geodesic. Under natural conditions of curvature of the limiting geodesic weight profile and stretched exponential decay of both tails of the point-to-point weight, we use geometric arguments to upgrade the assumptions to prove optimal upper and lower tail behavior with exponents $3/2$ and $3$ for the weight of the geodesic from $(1,1)$ to $(r,r)$ for all large finite $r$. The proofs merge several ideas, including the well known super-additivity property of last passage values, concentration of measure behavior for sums of stretched exponential random variables, and geometric insights coming from the study of geodesics and more general objects called geodesic watermelons. Previously such optimal behavior was only known for exactly solvable models, with proofs relying on hard analysis of formulas from integrable probability, which are unavailable in the general setting. Our results illustrate a facet of universality in a class of KPZ stochastic growth models and provide a geometric explanation of the upper and lower tail exponents of the GUE Tracy-Widom distribution, the conjectured one point scaling limit of such models. The key arguments are based on an observation of general interest that super-additivity allows a natural iterative bootstrapping procedure to obtain improved tail estimates.
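The abstract defines the point-to-point last passage value as the maximum, over up-right paths from $(1,1)$ to $(r,r)$, of the sum of i.i.d. vertex weights. The following is a minimal simulation sketch, not taken from the paper, that computes this value by the standard dynamic programming recursion $L(i,j) = w(i,j) + \max\{L(i-1,j),\, L(i,j-1)\}$. Exponential weights are assumed purely for concreteness (an exactly solvable case; the paper treats general weight distributions), and the function name and the empirical check at the end are illustrative choices rather than anything from the paper.

```python
import numpy as np

def last_passage_value(r, rng=None):
    """Point-to-point last passage value from (1,1) to (r,r) on Z^2.

    Illustrative sketch: uses i.i.d. Exp(1) vertex weights (an assumption
    for concreteness; the paper considers general distributions) and the
    dynamic programming recursion over up-right paths:
        L(i,j) = w(i,j) + max(L(i-1,j), L(i,j-1)).
    """
    rng = np.random.default_rng() if rng is None else rng
    w = rng.exponential(scale=1.0, size=(r, r))  # i.i.d. vertex weights
    L = np.zeros((r, r))
    for i in range(r):
        for j in range(r):
            best_prev = 0.0
            if i > 0:
                best_prev = L[i - 1, j]
            if j > 0:
                best_prev = max(best_prev, L[i, j - 1])
            L[i, j] = w[i, j] + best_prev
    return L[r - 1, r - 1]

# Rough sanity check: for Exp(1) weights the passage time satisfies
# L(r,r)/r -> 4 as r grows, with fluctuations of order r^{1/3}
# (the GUE Tracy-Widom scaling mentioned in the abstract).
samples = [last_passage_value(200) for _ in range(50)]
print(np.mean(samples) / 200)  # approaches 4 for large r
```

The bootstrapping and geodesic-watermelon arguments of the paper build on the super-additivity of these passage values; the sketch above only illustrates the underlying model, not the proof technique.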
