Self-Normalization for CUSUM-based Change Detection in Locally Stationary Time Series (2509.07112v1)

Published 8 Sep 2025 in math.ST, stat.ME, and stat.TH

Abstract: A novel self-normalization procedure for CUSUM-based change detection in the mean of a locally stationary time series is introduced. Classical self-normalization relies on the limiting partial-sum process factorizing into a constant long-run variance and a stochastic factor. In that case, the CUSUM statistic can be divided by another statistic that is proportional to the same long-run variance, so that the variance cancels and its tedious estimation is avoided. Under local stationarity, the partial-sum process converges to $\int_0^t \sigma(x)\, dB_x$ and no such factorization is possible. To overcome this obstacle, a self-normalized test statistic is constructed from a carefully designed bivariate partial-sum process. Weak convergence of this process is proven, and the resulting self-normalized test is shown to attain asymptotic level $\alpha$ under the null hypothesis of no change, while being consistent against a broad class of alternatives. Extensive simulations demonstrate better finite-sample properties than existing methods, and applications to real data illustrate the method's practical effectiveness.
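
The factorization argument in the abstract can be made concrete with a short sketch. The Python snippet below is illustrative only: the function name `sn_cusum` and the specific normalizer are assumptions, not taken from the paper, and they implement the classical self-normalized CUSUM idea for a stationary series rather than the paper's bivariate partial-sum construction for the locally stationary case. The point is simply that the CUSUM contrast is divided by a statistic built from recursive partial sums, so that a common long-run variance factor cancels.

```python
import numpy as np

def sn_cusum(x):
    """Illustrative self-normalized CUSUM statistic (classical form).

    For each candidate break point k, the CUSUM contrast
        D_n(k) = n^{-1/2} (S_k - (k/n) S_n),   S_j = x_1 + ... + x_j,
    is divided by a normalizer V_n(k) built from recursive partial sums
    on both sides of k. Under stationarity, numerator and normalizer are
    both proportional to the long-run variance, which therefore cancels.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    S = np.cumsum(x)                              # S[j-1] = S_j

    stat = 0.0
    for k in range(1, n):                         # candidate break points
        D = (S[k - 1] - (k / n) * S[-1]) / np.sqrt(n)

        # forward recursive CUSUMs on x_1, ..., x_k
        t = np.arange(1, k + 1)
        left = S[:k] - (t / k) * S[k - 1]

        # backward recursive CUSUMs on x_{k+1}, ..., x_n
        tail = S[-1] - S[k - 1:n - 1]             # sums x_t + ... + x_n
        m = np.arange(n - k, 0, -1)
        right = tail - (m / (n - k)) * (S[-1] - S[k - 1])

        V = (np.sum(left**2) + np.sum(right**2)) / n**2
        if V > 0:
            stat = max(stat, D**2 / V)
    return stat
```

Because the normalizer replaces an explicit long-run variance estimate, no bandwidth or kernel choice is needed; under the null the statistic is compared with quantiles of its pivotal limiting distribution. As the abstract notes, this cancellation breaks down when the variance profile $\sigma(\cdot)$ varies over time, which is what motivates the paper's bivariate construction.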
