
Equivalence Principle of the $P$-value and Mutual Information

Published 22 Aug 2023 in math.ST and stat.TH | arXiv:2308.14735v1

Abstract: In this paper, we propose a novel equivalence between probability theory and information theory. For a single random variable, Shannon's self-information, $I=-\log {p}$, is an alternative expression of a probability $p$. For two random variables, however, no information quantity equivalent to the $p$-value has been identified. Here, we prove theorems demonstrating that mutual information (MI) is equivalent to the $p$-value, irrespective of prior information about the distribution of the variables. If the maximum entropy principle can be applied, our equivalence theorems allow the $p$-value to be computed readily from multidimensional MI. Conversely, in a contingency table of any size with known marginal frequencies, our theorem states that MI asymptotically coincides with the logarithm of the $p$-value of Fisher's exact test, divided by the sample size. Accordingly, the theorems enable a meta-analysis that accurately estimates MI with a low $p$-value, thereby yielding a measure of informational interdependence that is robust against sample size variation. Thus, our theorems demonstrate the equivalence of the $p$-value and MI at every dimension, allow the merits of both to be exploited, and provide fundamental information for integrating probability theory and information theory.
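The contingency-table relation in the abstract, $\mathrm{MI} \approx -\log(p)/n$, can be checked numerically. The sketch below (an illustration under the asymptotic regime, not the paper's exact theorem statement) uses a hypothetical 2x2 table, computes the empirical MI in nats, and compares it against minus the log of Fisher's exact-test $p$-value divided by the sample size:

```python
import numpy as np
from scipy.stats import fisher_exact

# Hypothetical 2x2 contingency table with a strong row-column association.
table = np.array([[50, 10],
                  [10, 50]])
n = table.sum()

# Empirical mutual information (in nats) between the row and column variables.
joint = table / n
row = joint.sum(axis=1, keepdims=True)   # marginal over columns
col = joint.sum(axis=0, keepdims=True)   # marginal over rows
mask = joint > 0
mi = np.sum(joint[mask] * np.log(joint[mask] / (row @ col)[mask]))

# Two-sided Fisher's exact test on the same table.
_, p = fisher_exact(table)

print(f"MI          = {mi:.4f} nats")
print(f"-log(p) / n = {-np.log(p) / n:.4f}")
```

For tables with large cell counts the two printed quantities should be close, reflecting the asymptotic coincidence the abstract describes; for sparse tables the agreement degrades.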

Authors (2)
