
Mutual information of Contingency Tables and Related Inequalities (1402.0092v1)

Published 1 Feb 2014 in math.ST, cs.IT, math.IT, and stat.TH

Abstract: For testing independence it is very popular to use either the $\chi^2$-statistic or the $G^2$-statistic (mutual information). Asymptotically both are $\chi^2$-distributed, so an obvious question is which of the two statistics has a distribution closest to the $\chi^2$-distribution. Surprisingly, the distribution of mutual information is much better approximated by a $\chi^2$-distribution than the $\chi^2$-statistic is. For technical reasons we focus on the simplest case, with one degree of freedom. We introduce the signed log-likelihood and demonstrate that its distribution function can be related to the distribution function of a standard Gaussian by inequalities. For the hypergeometric distribution we formulate a general conjecture about how close the signed log-likelihood is to a standard Gaussian; this conjecture gives much more accurate estimates of the tail probabilities of this type of distribution than previously published results. The conjecture has been verified numerically in all cases relevant for testing independence, and further evidence of its validity is given.
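The two statistics compared in the abstract can be computed directly from a contingency table. The sketch below, using only the standard library, shows the $\chi^2$- and $G^2$-statistics for a 2x2 table, plus the signed log-likelihood $r = \operatorname{sign}(O_{11} - E_{11})\sqrt{G^2}$ for the one-degree-of-freedom case the paper focuses on. The function names are illustrative, not from the paper.

```python
import math

def chi2_and_g2(table):
    """Compute the chi^2 and G^2 (mutual information) statistics for
    a 2x2 contingency table given as [[a, b], [c, d]].

    chi^2 = sum (O - E)^2 / E,   G^2 = 2 * sum O * ln(O / E),
    where E is the expected count under independence."""
    n = sum(sum(row) for row in table)
    row_sums = [sum(row) for row in table]
    col_sums = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    g2 = 0.0
    for i in range(2):
        for j in range(2):
            observed = table[i][j]
            expected = row_sums[i] * col_sums[j] / n
            chi2 += (observed - expected) ** 2 / expected
            if observed > 0:  # the term 0 * ln(0) is taken as 0
                g2 += 2 * observed * math.log(observed / expected)
    return chi2, g2

def signed_loglik(table):
    """Signed log-likelihood for one degree of freedom:
    +/- sqrt(G^2), signed by whether the (1,1) cell exceeds
    its expected count under independence."""
    n = sum(sum(row) for row in table)
    expected_11 = sum(table[0]) * (table[0][0] + table[1][0]) / n
    _, g2 = chi2_and_g2(table)
    return math.copysign(math.sqrt(g2), table[0][0] - expected_11)
```

For example, for the table `[[10, 20], [20, 10]]` all expected counts are 15, giving $\chi^2 = 100/15 \approx 6.67$ and a slightly larger $G^2 \approx 6.80$; under the asymptotic approximation both would be referred to a $\chi^2$-distribution with one degree of freedom.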

Citations (3)
