A Partial Information Decomposition Based on Causal Tensors

Published 28 Jan 2020 in cs.IT and math.IT | (arXiv:2001.10481v4)

Abstract: We propose a partial information decomposition based on the newly introduced framework of causal tensors, i.e., multilinear stochastic maps that transform source data into destination data. This framework enables us to express an indirect association in terms of its constituent direct associations, which is not possible with average measures such as mutual information or transfer entropy. From this, an intuitive definition of redundant and unique information arises. The proposed redundancy satisfies the three axioms introduced by Williams and Beer. The symmetry and self-redundancy properties follow directly from our definition, and the Data Processing Inequality ensures that the monotonicity axiom is satisfied. Two additional, previously proposed axioms are also satisfied: the identity property and the left monotonicity axiom. Because causal tensors can describe both mutual information and transfer entropy, the proposed partial information decomposition applies to both measures. Results show that the decomposition closely resembles that of another approach, which expresses associations in terms of mutual information a posteriori. It is furthermore demonstrated that negative contributions can arise when our assumptions about the completeness of the data set, or about what should be included as a source, are incorrect.
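
The abstract's central idea, expressing an indirect association as a composition of direct ones, can be illustrated with a minimal sketch. The chain X → Y → Z, the specific matrices, and the representation of causal tensors as row-stochastic conditional-probability matrices are assumptions made here for illustration only; the paper's causal tensors are more general multilinear stochastic maps.

```python
import numpy as np

# Hypothetical direct associations for an assumed chain X -> Y -> Z,
# represented as row-stochastic matrices (row i gives a conditional distribution).
P_y_given_x = np.array([[0.9, 0.1],   # distribution of Y given X = 0, 1
                        [0.2, 0.8]])
P_z_given_y = np.array([[0.7, 0.3],   # distribution of Z given Y = 0, 1
                        [0.1, 0.9]])

# The indirect association X -> Z follows by composing the direct maps:
# P(Z = k | X = i) = sum_j P(Z = k | Y = j) * P(Y = j | X = i),
# i.e., a matrix product of the two stochastic maps.
P_z_given_x = P_y_given_x @ P_z_given_y

print(P_z_given_x)
assert np.allclose(P_z_given_x.sum(axis=1), 1.0)  # rows remain valid distributions
```

This composition is what average measures such as mutual information or transfer entropy cannot provide, since they summarize each association by a single number rather than by a map that can be chained.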
