A Measure of Synergy based on Union Information (2403.16575v1)
Abstract: The partial information decomposition (PID) framework is concerned with decomposing the information that a set of (two or more) random variables (the sources) has about another variable (the target) into three types of information: unique, redundant, and synergistic. Classical information theory alone does not provide a unique way to decompose information in this manner, so additional assumptions have to be made. One often overlooked way to achieve this decomposition is via a so-called measure of union information, which quantifies the information that is present in at least one of the sources, and from which a synergy measure stems. In this paper, we introduce a new measure of union information based on adopting a communication channel perspective, compare it with existing measures, and study some of its properties. We also include a comprehensive critical review of the characterizations of union information and synergy measures that have been proposed in the literature.
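The union-information route to synergy described in the abstract can be made concrete with the textbook XOR example: each source alone carries no information about the target, yet the two together determine it completely, so all of the joint information is synergistic. The sketch below (a minimal illustration, not the paper's proposed measure; `mutual_information` is a hypothetical helper computing mutual information from an equiprobable list of outcomes) verifies the mutual-information values underlying that claim.

```python
from itertools import product
from math import log2
from collections import Counter

def mutual_information(pairs):
    """I(A;B) in bits, estimated from a list of equiprobable (a, b) outcomes."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    return sum((c / n) * log2((c / n) / ((p_a[a] / n) * (p_b[b] / n)))
               for (a, b), c in p_ab.items())

# XOR gate: target Y = X1 XOR X2, with (X1, X2) uniform on {0,1}^2.
outcomes = [(x1, x2, x1 ^ x2) for x1, x2 in product((0, 1), repeat=2)]

i_x1 = mutual_information([(x1, y) for x1, _, y in outcomes])            # 0 bits
i_x2 = mutual_information([(x2, y) for _, x2, y in outcomes])            # 0 bits
i_joint = mutual_information([((x1, x2), y) for x1, x2, y in outcomes])  # 1 bit
```

Under the union-information view, synergy is the joint information minus the union information. Since each source alone is uninformative here, any union measure that vanishes when all individual informations vanish assigns I_union = 0, making the full 1 bit of joint information synergistic.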