On the Estimation of Mutual Information (1910.00365v1)
Abstract: In this paper we focus on the estimation of mutual information from finite samples on $\mathcal{X}\times\mathcal{Y}$. The main concern with estimates of mutual information is their robustness under the class of transformations that leave it invariant: i.e., type I (coordinate transformations), type III (marginalizations), and special cases of type IV (embeddings, products). Estimators that fail to meet these standards are not \textit{robust} in their general applicability. Since most machine learning tasks employ transformations belonging to the classes referenced in part I, the mutual information can tell us which transformations are optimal\cite{Carrara_Ernst}. Several classes of estimation methods exist in the literature, such as the non-parametric estimator developed by Kraskov et al.\cite{KSG} and its improved versions\cite{LNC}. These estimators are extremely useful, since they rely only on the geometry of the underlying sample and circumvent estimating the probability distribution itself. We explore the robustness of this family of estimators in the context of our design criteria.
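As a concrete illustration of the kind of geometry-based, non-parametric estimator the abstract refers to, below is a minimal Python sketch of the first KSG algorithm\cite{KSG}: it computes $\psi(k) + \psi(N) - \langle \psi(n_x+1) + \psi(n_y+1) \rangle$, where $n_x$ and $n_y$ count marginal neighbours strictly within each point's $k$-th nearest-neighbour distance in the joint space. The function name, the use of scipy for neighbour searches, and the small radius shrink are our own illustrative choices, not from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=3):
    """Estimate I(X;Y) in nats with the first KSG k-nearest-neighbour estimator."""
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    xy = np.hstack([x, y])

    # Distance to each point's k-th nearest neighbour in the joint space,
    # using the Chebyshev (max) norm as in Kraskov et al.
    joint_tree = cKDTree(xy)
    # Query k+1 neighbours because the query point itself is returned at distance 0.
    eps = joint_tree.query(xy, k=k + 1, p=np.inf)[0][:, -1]

    # Count marginal neighbours strictly inside eps (the tiny radius shrink
    # enforces the strict inequality; the "- 1" removes the point itself).
    nx = cKDTree(x).query_ball_point(x, r=eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, r=eps - 1e-12, p=np.inf, return_length=True) - 1

    # KSG estimate: psi(k) + psi(N) - <psi(n_x + 1) + psi(n_y + 1)>
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Sanity check on correlated Gaussians, where I(X;Y) = -0.5 * ln(1 - rho**2).
rng = np.random.default_rng(0)
rho = 0.8
x = rng.normal(size=(5000, 1))
y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=(5000, 1))
print(ksg_mutual_information(x, y, k=3))  # exact value is about 0.511 nats
```

Note that the estimate depends only on neighbour counts and ranks of distances, which is why this family is invariant under the smooth coordinate transformations (type I) discussed above.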