Improved bounds in Stein's method for functions of multivariate normal random vectors
Abstract: Gaunt (2020) extended Stein's method to limit distributions that can be represented as a function $g:\mathbb{R}^d\rightarrow\mathbb{R}$ of a centered multivariate normal random vector $\Sigma^{1/2}\mathbf{Z}$, where $\mathbf{Z}$ is a standard $d$-dimensional multivariate normal random vector and $\Sigma$ is a non-negative definite covariance matrix. In this paper, we obtain improved bounds, in the sense of weaker moment conditions, smaller constants and simpler forms, for the case that $g$ has derivatives with polynomial growth. We obtain new non-uniform bounds for the derivatives of the solution of the Stein equation, and use these inequalities to obtain general bounds on the distance, measured using smooth test functions, between the distributions of $g(\mathbf{W}_n)$ and $g(\mathbf{Z})$, where $\mathbf{W}_n$ is a standardised sum of random vectors with independent components and $\mathbf{Z}$ is a standard $d$-dimensional multivariate normal random vector. We apply these general bounds to obtain bounds for the chi-square approximation of the family of power divergence statistics (special cases include the Pearson and likelihood ratio statistics), for the case of two cell classifications, that improve on existing results in the literature.
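As context for the application mentioned in the abstract, the Cressie–Read power divergence statistic with index $\lambda$ for observed counts $O_j$ and expected counts $E_j$ is $T_\lambda = \frac{2}{\lambda(\lambda+1)}\sum_j O_j\big[(O_j/E_j)^\lambda - 1\big]$, with $\lambda = 1$ recovering Pearson's chi-square statistic and the limit $\lambda \to 0$ recovering the likelihood ratio statistic. A minimal sketch (the function name and the two-cell example data are illustrative, not from the paper):

```python
import math

def power_divergence(observed, expected, lam):
    """Cressie-Read power divergence statistic T_lambda.

    lam = 1 recovers Pearson's chi-square statistic;
    lam -> 0 is taken as the likelihood ratio limit.
    """
    if abs(lam) < 1e-12:  # limit lam -> 0: likelihood ratio statistic
        return 2.0 * sum(o * math.log(o / e) for o, e in zip(observed, expected))
    return (2.0 / (lam * (lam + 1.0))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(observed, expected)
    )

# Two-cell classification: n = 100 trials, null probabilities (0.5, 0.5)
obs, exp = [58, 42], [50.0, 50.0]
pearson = power_divergence(obs, exp, lam=1.0)  # equals sum((o - e)^2 / e)
lr = power_divergence(obs, exp, lam=0.0)       # likelihood ratio statistic
```

Under the null hypothesis, both statistics are asymptotically chi-square distributed with one degree of freedom in the two-cell case; the paper's bounds quantify the rate of this approximation.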