Higher-Order Entrywise Eigenvectors Analysis of Low-Rank Random Matrices: Bias Correction, Edgeworth Expansion, and Bootstrap (2401.15033v2)

Published 26 Jan 2024 in math.ST and stat.TH

Abstract: Understanding the distributions of spectral estimators in low-rank random matrix models, also known as signal-plus-noise matrix models, is fundamentally important in various statistical learning problems, including network analysis, matrix denoising, and matrix completion. This paper investigates the entrywise eigenvector distributions in a broad range of low-rank signal-plus-noise matrix models by establishing their higher-order accurate stochastic expansions. At a high level, the stochastic expansion states that the eigenvector perturbation approximately decomposes into the sum of a first-order term and a second-order term, where the first-order term is a linear function of the noise matrix and the second-order term is a linear function of the squared noise matrix. This theoretical finding is used to derive a bias correction procedure for the eigenvectors. We further establish the Edgeworth expansion formula for the studentized entrywise eigenvector statistics. In particular, under mild conditions, we show that Cramér's condition on the smoothness of the noise distribution is not required, thanks to the self-smoothing effect of the second-order term in the eigenvector stochastic expansion. The Edgeworth expansion result is then applied to justify the higher-order correctness of the residual bootstrap procedure for approximating the distributions of the studentized entrywise eigenvector statistics.
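
To make the expansion concrete, here is a schematic rendering of the decomposition described in the abstract; the notation (observed matrix $\widehat{A} = A + E$ with low-rank signal $A$, noise matrix $E$, and an eigenpair $(\lambda_k, u_k)$ of $A$ with empirical counterpart $\widehat{u}_k$) is introduced only for illustration, and the exact scaling factors and remainder bounds are those derived in the paper:

$$
\widehat{u}_k - u_k \;\approx\; \underbrace{\mathcal{L}_{1,k}(E)}_{\text{linear in } E} \;+\; \underbrace{\mathcal{L}_{2,k}(E^2)}_{\text{linear in } E^2} \;+\; \text{higher-order remainder},
$$

where, in the simplest rank-one symmetric setting, classical perturbation theory suggests a first-order term of roughly the form $\lambda_1^{-1}(I - u_1 u_1^{\top}) E\, u_1$.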
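
The residual bootstrap mentioned at the end of the abstract can also be sketched numerically. The code below is a minimal, generic illustration of the idea for a symmetric signal-plus-noise matrix, not the paper's exact procedure: estimate the low-rank signal from the leading eigen-decomposition, form the residual matrix, resample it (here by symmetric random sign flips), and recompute the leading eigenvector on each bootstrap replicate. The function name and the sign-flip resampling scheme are assumptions for illustration, and the sketch omits the paper's bias correction and studentization steps.

```python
import numpy as np

def residual_bootstrap_eigvec(A_hat, rank=1, n_boot=500, rng=None):
    """Generic residual-bootstrap sketch for the leading eigenvector of a
    symmetric signal-plus-noise matrix A_hat = A + E (illustration only;
    not the paper's studentized procedure)."""
    rng = np.random.default_rng() if rng is None else rng
    n = A_hat.shape[0]

    # Plug-in estimate of the low-rank signal from the leading eigenpairs.
    evals, evecs = np.linalg.eigh(A_hat)
    idx = np.argsort(np.abs(evals))[::-1][:rank]
    lam, U = evals[idx], evecs[:, idx]
    signal_hat = U @ np.diag(lam) @ U.T

    # Estimated residual (noise) matrix.
    E_hat = A_hat - signal_hat

    boot_vecs = np.empty((n_boot, n))
    for b in range(n_boot):
        # Resample residual entries with symmetric random sign flips
        # (a simple wild/residual bootstrap variant, used here for illustration).
        signs = rng.choice([-1.0, 1.0], size=(n, n))
        signs = np.triu(signs) + np.triu(signs, 1).T
        A_boot = signal_hat + signs * E_hat

        evals_b, evecs_b = np.linalg.eigh(A_boot)
        u_b = evecs_b[:, np.argmax(np.abs(evals_b))]
        # Align sign with the original estimate before comparing entrywise.
        if u_b @ U[:, 0] < 0:
            u_b = -u_b
        boot_vecs[b] = u_b

    return U[:, 0], boot_vecs
```

The returned collection of bootstrap eigenvectors can then be used to approximate entrywise quantiles of the eigenvector perturbation; the studentized statistics analyzed in the paper would additionally divide each entry by an estimated standard error.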
