
Bounds for the chi-square approximation of the power divergence family of statistics (2107.00535v2)

Published 29 Jun 2021 in math.ST, math.PR, and stat.TH

Abstract: It is well-known that each statistic in the family of power divergence statistics, across $n$ trials and $r$ classifications with index parameter $\lambda\in\mathbb{R}$ (the Pearson, likelihood ratio and Freeman-Tukey statistics correspond to $\lambda=1,0,-1/2$, respectively) is asymptotically chi-square distributed as the sample size tends to infinity. In this paper, we obtain explicit bounds on this distributional approximation, measured using smooth test functions, that hold for a given finite sample $n$, and all index parameters ($\lambda>-1$) for which such finite sample bounds are meaningful. We obtain bounds that are of the optimal order $n^{-1}$. The dependence of our bounds on the index parameter $\lambda$ and the cell classification probabilities is also optimal, and the dependence on the number of cells is also respectable. Our bounds generalise, complement and improve on recent results from the literature.
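The power divergence statistic referred to in the abstract has the standard form $T_\lambda = \frac{2}{\lambda(\lambda+1)} \sum_{j=1}^{r} N_j \big[ (N_j / (n p_j))^\lambda - 1 \big]$, with the $\lambda = 0$ case taken as a limit. A minimal sketch of this family (function name and example counts are illustrative, not from the paper):

```python
import math

def power_divergence(counts, probs, lam):
    """Power divergence statistic T_lambda for observed multinomial counts
    against null cell probabilities.

    lam = 1 recovers Pearson's chi-square, lam = 0 the likelihood ratio
    statistic (as the lambda -> 0 limit), lam = -1/2 Freeman-Tukey.
    """
    n = sum(counts)
    expected = [n * p for p in probs]
    if lam == 0:
        # limit lambda -> 0: 2 * sum_j N_j * log(N_j / E_j)
        return 2.0 * sum(N * math.log(N / E)
                         for N, E in zip(counts, expected) if N > 0)
    return (2.0 / (lam * (lam + 1))) * sum(
        N * ((N / E) ** lam - 1) for N, E in zip(counts, expected)
    )

# Illustrative data: n = 100 trials over r = 4 equiprobable cells.
counts = [18, 25, 32, 25]
probs = [0.25, 0.25, 0.25, 0.25]

# Sanity check: lambda = 1 agrees with Pearson's chi-square statistic.
pearson = sum((N - 100 * p) ** 2 / (100 * p) for N, p in zip(counts, probs))
assert abs(power_divergence(counts, probs, 1.0) - pearson) < 1e-9
```

Each $T_\lambda$ is compared against the chi-square distribution with $r-1$ degrees of freedom; the paper's contribution is an explicit, order-$n^{-1}$ bound on the error of that approximation for all $\lambda > -1$.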

Citations (8)
