Asymptotic lower bounds in estimating jumps
Abstract: We study the problem of efficient estimation of the jumps of stochastic processes. We assume that the stochastic jump process $(X_t)_{t\in[0,1]}$ is observed discretely, with a sampling step of size $1/n$. In the spirit of Hájek's convolution theorem, we establish lower bounds for the estimation error of the sequence of jumps $(\Delta X_{T_k})_k$. As an intermediate result, we prove a LAMN property, with rate $\sqrt{n}$, when the marks of the underlying jump component are deterministic. We then deduce a convolution theorem, with an explicit asymptotic minimal variance, in the case where the marks of the jump component are random. To show that this lower bound is optimal, we prove that a threshold estimator of the sequence of jumps $(\Delta X_{T_k})_k$ based on the discrete observations reaches the minimal variance of the convolution theorem.
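The threshold estimator mentioned in the abstract can be illustrated with a minimal sketch: increments of the discretely observed path whose magnitude exceeds a shrinking threshold are flagged as jumps, and the increment itself estimates the jump size. The function name, the simulated jump-diffusion path, and the particular threshold choice $u_n = 4\sqrt{\log n / n}$ below are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def threshold_jump_estimator(X, threshold):
    """Return the indices and sizes of increments of X whose
    magnitude exceeds the given threshold (candidate jumps)."""
    incr = np.diff(X)
    idx = np.flatnonzero(np.abs(incr) > threshold)
    return idx, incr[idx]

# Illustrative simulation (assumed setup): Brownian motion observed at
# step 1/n, plus two deterministic jumps at t = 0.3 and t = 0.7.
rng = np.random.default_rng(0)
n = 10_000
dt = 1.0 / n
X = np.cumsum(rng.normal(0.0, np.sqrt(dt), n))
X[3000:] += 1.0    # jump of size +1.0
X[7000:] -= 0.5    # jump of size -0.5

# Threshold shrinking to 0 more slowly than the typical Brownian
# increment sqrt(dt); this choice is a common illustrative one.
u_n = 4.0 * np.sqrt(np.log(n) / n)
idx, sizes = threshold_jump_estimator(X, u_n)
```

On this simulated path the two increments containing jumps dominate the Gaussian increments (of order $\sqrt{1/n}$), so the estimator recovers the jump locations and sizes up to the discretization noise.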