A Measure of Synergy based on Union Information (2403.16575v1)

Published 25 Mar 2024 in cs.IT and math.IT

Abstract: The partial information decomposition (PID) framework is concerned with decomposing the information that a set of (two or more) random variables (the sources) has about another variable (the target) into three types of information: unique, redundant, and synergistic. Classical information theory alone does not provide a unique way to decompose information in this manner, so additional assumptions have to be made. One often overlooked way to achieve this decomposition is via a so-called measure of union information, which quantifies the information that is present in at least one of the sources, and from which a synergy measure stems. In this paper, we introduce a new measure of union information based on adopting a communication channel perspective, compare it with existing measures, and study some of its properties. We also include a comprehensive critical review of the characterizations of union information and synergy measures that have been proposed in the literature.
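
As background (this relation is standard in the PID literature, following Griffith and Koch's whole-minus-union construction; the notation below is an illustrative sketch, not this paper's own definition): given a measure of union information I_∪, the corresponding synergy measure is obtained by subtracting it from the total mutual information that the sources jointly carry about the target T:

    S(T; X_1, ..., X_n) = I(T; X_1, ..., X_n) − I_∪(T; X_1, ..., X_n)

A canonical illustration is the XOR gate: for independent uniform bits X_1, X_2 and T = X_1 ⊕ X_2, each source alone is uninformative (I(T; X_1) = I(T; X_2) = 0) while jointly I(T; X_1, X_2) = 1 bit, so any union measure that vanishes when every individual source is uninformative yields S = 1 bit, i.e., all of the information about T is synergistic.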
