A functional large and moderate deviation principle for infinitely divisible processes driven by null-recurrent Markov chains (1010.4313v2)

Published 20 Oct 2010 in math.PR, math.ST, and stat.TH

Abstract: Suppose $ E$ is a space with a null-recurrent Markov kernel $ P$. Furthermore, suppose there are infinitely many particles with variable weights on $ E$ performing a random walk following $ P$. Let $ X_{t}$ be a weighted functional of the positions of the particles at time $ t$. Under some conditions on the initial distribution of the particles, the process $ (X_{t})$ is stationary over time. Non-Gaussian infinitely divisible (ID) distributions turn out to be natural candidates for the initial distribution, and then the process $ (X_{t})$ is ID. We prove a functional large and moderate deviation principle for the partial sums of the process $ (X_{t})$. The recurrence of the Markov kernel $ P$ induces long memory in the process $ (X_{t})$, and that is reflected in the large deviation principle. It has been observed for certain short memory processes that the large deviation principle is very similar to that of an i.i.d. sequence, whereas if the process is long-range dependent the large deviations change dramatically. We show that a similar phenomenon occurs for infinitely divisible processes driven by Markov chains. Processes of the form $ (X_{t})$ give us a rich class of non-Gaussian long memory models which may be useful in practice.
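One concrete way to read the model described in the abstract is as a weighted particle system; the notation below (weights $w_i$, test function $f$, particle paths $Y_i$) is illustrative and not taken from the paper itself:

```latex
% Illustrative sketch, notation assumed: particles Y_i(t) move on E as
% independent copies of a Markov chain with null-recurrent kernel P,
% each carrying a weight w_i, and X_t aggregates a function of their positions:
\[
  X_t = \sum_{i \ge 1} w_i \, f\bigl(Y_i(t)\bigr), \qquad t = 0, 1, 2, \dots
\]
% If the initial configuration \{(Y_i(0), w_i)\}_{i \ge 1} is generated by a
% suitable infinitely divisible (e.g. Poissonian) random measure, then each
% X_t is ID and (X_t) is stationary; the large and moderate deviation
% principles concern the partial-sum process S_n = X_1 + \cdots + X_n.
```

Under this reading, the null recurrence of $P$ makes the particles return to any given set only sporadically, which is the mechanism behind the long memory of $(X_t)$ mentioned in the abstract.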

Authors (1)