
On the Estimation of Mutual Information (1910.00365v1)

Published 1 Oct 2019 in physics.data-an

Abstract: In this paper we focus on the estimation of mutual information from finite samples on $\mathcal{X}\times\mathcal{Y}$. The main concern with estimates of mutual information is their robustness under the class of transformations for which it remains invariant: type I (coordinate transformations), type III (marginalizations), and special cases of type IV (embeddings, products). Estimators which fail to meet these standards are not \textit{robust} in their general applicability. Since most machine learning tasks employ transformations belonging to the classes referenced in part I, the mutual information can tell us which transformations are optimal\cite{Carrara_Ernst}. There are several classes of estimation methods in the literature, such as non-parametric estimators like the one developed by Kraskov et al.\cite{KSG} and its improved versions\cite{LNC}. These estimators are extremely useful, since they rely only on the geometry of the underlying sample and circumvent estimating the probability distribution itself. We explore the robustness of this family of estimators in the context of our design criteria.
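The abstract's reference to estimators that "rely only on the geometry of the underlying sample" points to k-nearest-neighbour methods. As a concrete illustration, here is a minimal Python sketch of the first KSG algorithm from Kraskov et al. (the abstract's \cite{KSG}), which estimates $I(X;Y)$ as $\psi(k) + \psi(N) - \langle \psi(n_x+1) + \psi(n_y+1) \rangle$. The function name and the default $k=3$ are illustrative choices, not taken from this paper.

```python
# Minimal sketch of the KSG (Kraskov-Stogbauer-Grassberger) estimator,
# algorithm 1: I(X;Y) ~= psi(k) + psi(N) - <psi(n_x + 1) + psi(n_y + 1)>.
# Illustrative only; not the authors' reference implementation.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma


def ksg_mutual_information(x, y, k=3):
    """Estimate I(X;Y) in nats from N paired samples x, y."""
    x = x.reshape(len(x), -1)           # ensure shape (N, d_x)
    y = y.reshape(len(y), -1)           # ensure shape (N, d_y)
    n = len(x)
    joint = np.hstack([x, y])

    # eps[i]: max-norm distance from point i to its k-th neighbour in the
    # joint space (k + 1 because the query returns the point itself first).
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]

    # n_x[i], n_y[i]: marginal neighbours strictly within eps[i].
    # np.nextafter shrinks the radius to make the inequality strict;
    # the -1 removes the point itself from the count.
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    nx = np.array([len(tree_x.query_ball_point(x[i], np.nextafter(eps[i], 0),
                                               p=np.inf)) - 1 for i in range(n)])
    ny = np.array([len(tree_y.query_ball_point(y[i], np.nextafter(eps[i], 0),
                                               p=np.inf)) - 1 for i in range(n)])

    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))


if __name__ == "__main__":
    # Sanity check on correlated Gaussians, where the exact value is
    # I = -0.5 * ln(1 - rho**2) ~= 0.5108 nats for rho = 0.8.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(5000)
    y = 0.8 * x + 0.6 * rng.standard_normal(5000)
    print(ksg_mutual_information(x, y))
```

A quick robustness experiment in the spirit of the paper is to re-run the estimator after a type I transformation of one marginal (e.g. replacing x with x**3): the true mutual information is unchanged, so any drift in the estimate measures how well the estimator meets the abstract's invariance criteria.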
