On the Monotonicity of the Copula Entropy (1611.06714v1)
Abstract: Understanding the way in which random entities interact is of key interest in numerous scientific fields. This interest can range from a full characterization of the joint distribution to single scalar summary statistics. In this work we identify a novel relationship between the ubiquitous Shannon mutual information measure and the central tool for capturing real-valued non-Gaussian distributions, namely the framework of copulas. Specifically, we establish a monotonic relationship between the mutual information and the copula dependence parameter for a wide range of copula families. Beyond its theoretical novelty, our result gives rise to a highly efficient proxy for the expected likelihood, which in turn allows for scalable model selection (e.g. when learning probabilistic graphical models).
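As a concrete illustration of the monotonicity the abstract describes, consider the bivariate Gaussian copula, whose mutual information has the standard closed form I(ρ) = -½ log(1 - ρ²); the copula entropy is its negative. The sketch below is only an illustrative check for this one family (it is not the paper's general proof): it evaluates the closed form over a grid of dependence parameters and verifies that the mutual information is monotone in ρ.

```python
import numpy as np

def gaussian_copula_mutual_information(rho):
    """Closed-form mutual information of a bivariate Gaussian copula
    with correlation parameter rho (a standard result, not specific
    to this paper)."""
    return -0.5 * np.log(1.0 - rho ** 2)

def gaussian_copula_entropy(rho):
    """Copula entropy is the negative of the mutual information."""
    return -gaussian_copula_mutual_information(rho)

# Mutual information grows monotonically with the dependence
# parameter, so the copula entropy decreases monotonically --
# the behaviour the abstract claims for a wide range of families.
rhos = np.linspace(0.0, 0.95, 20)
mi = gaussian_copula_mutual_information(rhos)
assert np.all(np.diff(mi) > 0)  # monotone in rho on [0, 1)
print(dict(zip(np.round(rhos, 2), np.round(mi, 4))))
```

Because such a monotone map exists, ranking candidate models by the (cheap) copula parameter is equivalent to ranking them by mutual information, which is what makes the proxy useful for scalable model selection.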