Sample-Path Large Deviations for Lévy Processes and Random Walks with Lognormal Increments (2410.20799v1)

Published 28 Oct 2024 in math.PR

Abstract: The large deviations theory for heavy-tailed processes has seen significant advances in the recent past. In particular, Rhee et al. (2019) and Bazhba et al. (2020) established large deviation asymptotics at the sample-path level for Lévy processes and random walks with regularly varying and (heavy-tailed) Weibull-type increments. This leaves the lognormal case -- one of the three most prominent classes of heavy-tailed distributions, alongside regular variation and Weibull -- open. This article establishes the extended large deviation principle (extended LDP) at the sample-path level for one-dimensional Lévy processes and random walks with lognormal-type increments. Building on these results, we also establish extended LDPs for multi-dimensional processes with independent coordinates. We demonstrate the sharpness of these results by constructing counterexamples, thereby proving that our results cannot be strengthened to a standard LDP under the $J_1$ topology and the $M_1'$ topology.
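For orientation only (this is a generic formulation, with notation not taken from the paper): the extended LDP, in the sense introduced by Borovkov and Mogulskii and adopted by Rhee et al. (2019), weakens the standard LDP upper bound by fattening the closed set. A sequence $(X_n)$ in a metric space $(S,d)$, with speed $\gamma_n \to \infty$ and rate function $I$, satisfies the extended LDP if for every closed set $F$ and open set $G$,

$$\limsup_{n\to\infty} \frac{1}{\gamma_n} \log \mathbb{P}(X_n \in F) \le -\lim_{\epsilon \downarrow 0} \inf_{x \in F^{\epsilon}} I(x), \qquad \liminf_{n\to\infty} \frac{1}{\gamma_n} \log \mathbb{P}(X_n \in G) \ge -\inf_{x \in G} I(x),$$

where $F^{\epsilon} = \{x \in S : d(x,F) < \epsilon\}$ denotes the $\epsilon$-fattening of $F$. The standard LDP corresponds to the upper bound holding with $F$ itself in place of $F^{\epsilon}$; the counterexamples mentioned in the abstract show that this strengthening is not available here under the $J_1$ and $M_1'$ topologies.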
