Rates of Fisher information convergence in the central limit theorem for nonlinear statistics (2205.14446v4)
Abstract: We develop a general method to study the Fisher information distance in the central limit theorem for nonlinear statistics. We first construct completely new representations for the score function, and we then use these representations to derive quantitative estimates for the Fisher information distance. To illustrate the applicability of our approach, we provide explicit rates of Fisher information convergence for quadratic forms and for functions of sample means. For sums of independent random variables, we obtain Fisher information bounds without requiring a finite Poincar\'e constant. Our method can also be used to bound the Fisher information distance in non-central limit theorems.
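For context, the Fisher information distance between a standardized statistic $W$ and the standard Gaussian is usually expressed through the score function of $W$. The display below records this standard definition as a reference point; it is a generic formulation, not the paper's new score representation.

% Standard (relative) Fisher information distance to N(0,1).
% Assumptions: W has an absolutely continuous density f with
% E[W] = 0, Var(W) = 1, and score function \rho_W = f'/f.
\[
  J(W) \;=\; \mathbb{E}\big[\big(\rho_W(W) + W\big)^2\big]
       \;=\; I(W) - 1,
  \qquad
  I(W) \;=\; \mathbb{E}\big[\rho_W(W)^2\big],
\]
% J(W) = 0 iff W is standard Gaussian; up to constants, J(W) dominates
% the Kullback--Leibler and total variation distances to the Gaussian,
% so Fisher information convergence is a strong form of the CLT.

In this notation, the paper's rates concern how fast $J(W_n) \to 0$ when $W_n$ is a normalized nonlinear statistic such as a quadratic form or a smooth function of sample means.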