A scalable, synergy-first backbone decomposition of higher-order structures in complex systems (2402.08135v1)
Abstract: Since its introduction in 2011, the partial information decomposition (PID) has triggered an explosion of interest in the field of multivariate information theory and the study of emergent, higher-order ("synergistic") interactions in complex systems. Despite its power, however, the PID has a number of limitations that restrict its general applicability: it scales poorly with system size, and the standard approach to decomposition hinges on a definition of "redundancy", leaving synergy only vaguely defined as "that information not redundant." Other heuristic measures, such as the O-information, have been introduced, although such measures typically provide only a summary statistic of redundancy/synergy dominance, rather than direct insight into the synergy itself. To address these issues, we present an alternative decomposition that is synergy-first, scales much more gracefully than the PID, and has a straightforward interpretation. Our approach defines synergy as that information in a set that would be lost following the minimally invasive perturbation of any single element. By generalizing this idea to sets of elements, we construct a totally ordered "backbone" of partial synergy atoms that sweeps across system scales. Our approach starts with entropy, but can be generalized to the Kullback-Leibler divergence, and by extension, to the total correlation and the single-target mutual information. Finally, we show that this approach can be used to decompose higher-order interactions beyond just information theory: we demonstrate this by showing how synergistic combinations of pairwise edges in a complex network support signal communicability and global integration. We conclude by discussing how this perspective on synergistic structure (information-based or otherwise) can deepen our understanding of part-whole relationships in complex systems.
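To make the abstract's central definition more concrete, the sketch below instantiates one plausible reading of "the information lost following the minimally invasive perturbation of any single element": each element of a discrete joint distribution is decoupled from the rest (replaced by its marginal), and the whole-system synergy is taken as the smallest resulting Kullback-Leibler divergence from the original joint. The decoupling perturbation, the function names, and the XOR example are illustrative assumptions, not the paper's exact estimator.

```python
# Minimal sketch (not the paper's estimator): whole-system synergy read as the
# information lost under the least damaging single-element perturbation, where
# the hypothetical perturbation decouples one element from the rest and the
# loss is measured by Kullback-Leibler divergence.
import itertools
import numpy as np

def decouple_element(p, i):
    """Perturbed joint in which element i is made independent of the rest,
    i.e. p(x_i) * p(x_rest). A simple stand-in for a 'single-element
    perturbation'; the paper's minimally invasive perturbation may differ."""
    marg_i = p.sum(axis=tuple(j for j in range(p.ndim) if j != i), keepdims=True)
    marg_rest = p.sum(axis=i, keepdims=True)
    return marg_i * marg_rest

def kl_bits(p, q):
    """D(p || q) in bits, evaluated on the support of p."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def whole_system_synergy(p):
    """min over elements i of D(p || decouple_element(p, i)): the information
    that even the gentlest single-element perturbation destroys."""
    return min(kl_bits(p, decouple_element(p, i)) for i in range(p.ndim))

# Logical XOR (Z = X ^ Y, X and Y fair coins): a canonically synergistic triad.
p_xor = np.zeros((2, 2, 2))
for x, y in itertools.product((0, 1), repeat=2):
    p_xor[x, y, x ^ y] = 0.25

print(whole_system_synergy(p_xor))  # -> 1.0 bit; no single decoupling is harmless
```

Under this reading, the XOR triad loses a full bit no matter which element is decoupled, which matches the intuition that its information is carried only by the whole; how the perturbation and loss are actually defined and generalized to sets of elements is developed in the paper itself.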