
Detecting Change Points of Covariance Matrices in High Dimensions (2409.15588v1)

Published 23 Sep 2024 in math.ST, math.PR, and stat.TH

Abstract: Testing for change points in sequences of high-dimensional covariance matrices is an important and equally challenging problem in statistical methodology with applications in various fields. Motivated by the observation that even in cases where the ratio between dimension and sample size is as small as $0.05$, tests based on fixed-dimension asymptotics do not keep their preassigned level, we propose to derive critical values of test statistics using an asymptotic regime where the dimension diverges at the same rate as the sample size. This paper introduces a novel and well-founded statistical methodology for detecting change points in a sequence of high-dimensional covariance matrices. Our approach utilizes a min-type statistic based on a sequential process of likelihood ratio statistics. This is used to construct a test for the hypothesis of the existence of a change point with a corresponding estimator for its location. We provide theoretical guarantees for these inference tools by thoroughly analyzing the asymptotic properties of the sequential process of likelihood ratio statistics in the case where the dimension and sample size converge with the same rate to infinity. In particular, we prove weak convergence towards a Gaussian process under the null hypothesis of no change. To handle the challenging dependency structure between consecutive test statistics, we employ tools from random matrix theory and stochastic processes. Moreover, we show that the new test attains power under a class of alternatives reflecting changes in the bulk of the spectrum, and we prove consistency of the estimator for the change-point location.
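To make the ingredients of the approach concrete, the following is a minimal Python sketch of a sequential likelihood-ratio scan for a covariance change, assuming zero-mean Gaussian observations. The function name `sequential_cov_lrt`, the trimming fraction, and the argmax location estimator are illustrative choices, not the paper's definitions; in particular, the min-type statistic and the critical values derived in the regime where dimension and sample size diverge at the same rate are not reproduced here.

```python
# Illustrative sketch only: a classical (fixed-dimension) sequential likelihood-ratio
# scan for a change in the covariance matrix of zero-mean Gaussian data. The paper's
# method applies a min-type functional to this kind of sequential process and derives
# critical values under p/n -> c > 0 asymptotics; those corrections are NOT included.
import numpy as np

def sequential_cov_lrt(X, trim=0.1):
    """Return -2 * log-likelihood-ratio statistics for a covariance change
    at each candidate split point k (mean assumed known and equal to zero)."""
    n, p = X.shape
    lo, hi = int(np.floor(n * trim)), int(np.ceil(n * (1 - trim)))
    S_full = X.T @ X / n                          # MLE covariance under "no change"
    _, logdet_full = np.linalg.slogdet(S_full)
    stats = {}
    for k in range(max(lo, p + 1), min(hi, n - p - 1)):
        S1 = X[:k].T @ X[:k] / k                  # MLE covariance before the split
        S2 = X[k:].T @ X[k:] / (n - k)            # MLE covariance after the split
        _, logdet1 = np.linalg.slogdet(S1)
        _, logdet2 = np.linalg.slogdet(S2)
        stats[k] = n * logdet_full - k * logdet1 - (n - k) * logdet2
    return stats

# Usage: scan for the most pronounced change and report its estimated location.
rng = np.random.default_rng(0)
n, p, k_true = 400, 20, 200
X = np.vstack([rng.standard_normal((k_true, p)),
               rng.standard_normal((n - k_true, p)) * 1.5])  # variance shift at k_true
stats = sequential_cov_lrt(X)
k_hat = max(stats, key=stats.get)                 # naive argmax change-point estimator
print(k_hat, stats[k_hat])
```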
