A Logarithmic Decomposition and a Signed Measure Space for Entropy (2409.03732v2)
Abstract: The Shannon entropy of a random variable exhibits behaviour closely analogous to that of a signed measure. Previous work has explored this connection by defining a signed measure on abstract sets, which are taken to represent the information that different random variables contain. This construction is sufficient to derive many measure-theoretic counterparts to information quantities such as the mutual information (the intersection of sets), the joint entropy (the union of sets), and the conditional entropy (the difference of sets). Here we provide concrete characterisations of these abstract sets and a corresponding signed measure by extending the approach used by Yeung to all possible outcomes in an outcome space $\Omega$, and in doing so we demonstrate that there exists a much finer decomposition with intuitive properties, which we call the logarithmic decomposition (LD). We show that this signed measure space has the useful property that its logarithmic atoms are easily characterised as having negative or positive entropy, depending only on their structure, while also being consistent with Yeung's I-measure. We demonstrate the utility of our approach by re-examining the Gács-Körner common information and minimal sufficient statistics from this new geometric perspective, characterising them in terms of our logarithmic atoms -- a property we call logarithmic decomposability. We present possible extensions of this construction to continuous probability distributions before discussing implications for quality-led information theory. As a motivating example, we apply our new decomposition to the dyadic and triadic systems of James and Crutchfield and show that, in contrast to the I-measure alone, our decomposition is able to qualitatively distinguish between them.
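The motivating example can be made concrete with a short sketch. The following Python snippet (an illustration written for this summary, not the paper's logarithmic decomposition) builds the standard dyadic and triadic distributions of James and Crutchfield and computes the classical signed I-measure atoms for three variables via inclusion-exclusion; both systems yield identical atoms, which is why the I-measure alone cannot distinguish them. The constructions and helper names (`dyadic`, `triadic`, `i_measure_atoms`) are this sketch's own.

```python
from itertools import product
from math import log2

def dyadic():
    # X = (a, b), Y = (b, c), Z = (c, a) for independent uniform bits a, b, c.
    return {(2 * a + b, 2 * b + c, 2 * c + a): 1 / 8
            for a, b, c in product((0, 1), repeat=3)}

def triadic():
    # First bits satisfy a ^ b ^ c = 0; the second bit d is shared by all three.
    return {(2 * a + d, 2 * b + d, 2 * (a ^ b) + d): 1 / 8
            for a, b, d in product((0, 1), repeat=3)}

def entropy(dist, idx):
    """Shannon entropy (bits) of the marginal on the coordinates in idx."""
    marg = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

def i_measure_atoms(dist):
    """Signed I-measure atoms for three variables via inclusion-exclusion."""
    H = {s: entropy(dist, s) for s in
         [(0,), (1,), (2,), (0, 1), (0, 2), (1, 2), (0, 1, 2)]}
    return {
        "H(X|Y,Z)": H[(0, 1, 2)] - H[(1, 2)],
        "H(Y|X,Z)": H[(0, 1, 2)] - H[(0, 2)],
        "H(Z|X,Y)": H[(0, 1, 2)] - H[(0, 1)],
        "I(X;Y|Z)": H[(0, 2)] + H[(1, 2)] - H[(0, 1, 2)] - H[(2,)],
        "I(X;Z|Y)": H[(0, 1)] + H[(1, 2)] - H[(0, 1, 2)] - H[(1,)],
        "I(Y;Z|X)": H[(0, 1)] + H[(0, 2)] - H[(0, 1, 2)] - H[(0,)],
        # Co-information: the atom that can be negative in general,
        # reflecting the signed-measure character of entropy.
        "I(X;Y;Z)": (H[(0,)] + H[(1,)] + H[(2,)]
                     - H[(0, 1)] - H[(0, 2)] - H[(1, 2)]
                     + H[(0, 1, 2)]),
    }

for name, dist in [("dyadic", dyadic()), ("triadic", triadic())]:
    print(name, {k: round(v, 3) for k, v in i_measure_atoms(dist).items()})
```

Running this prints the same atom values (0, 0, 0, 1, 1, 1, 0 bits) for both systems, illustrating the degeneracy that the paper's finer logarithmic decomposition is designed to break.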