Information decomposition in complex systems via machine learning (2307.04755v2)
Abstract: One of the fundamental steps toward understanding a complex system is identifying variation at the scale of the system's components that is most relevant to behavior on a macroscopic scale. Mutual information provides a natural means of linking variation across scales of a system because it is agnostic to the functional relationship between observables. However, characterizing how information is distributed across a set of observables is computationally challenging and generally infeasible beyond a handful of measurements. Here we propose a practical and general methodology that uses machine learning to decompose the information contained in a set of measurements by jointly optimizing a lossy compression of each measurement. Guided by the distributed information bottleneck as a learning objective, the information decomposition identifies the variation in the measurements of the system state most relevant to specified macroscale behavior. We focus our analysis on two paradigmatic complex systems: a Boolean circuit and an amorphous material undergoing plastic deformation. In both examples, the large entropy of the system state is decomposed, bit by bit, in terms of what is most related to macroscale behavior. The identification of meaningful variation in data, with the full generality brought by information theory, is made practical for studying the connection between micro- and macroscale structure in complex systems.
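The quantity the method targets, the mutual information between components of the microstate and a macroscale output, can be computed exactly by brute force when the system is tiny. The sketch below does this for a hypothetical 3-input Boolean circuit (the circuit, `Y = (x0 AND x1) OR x2`, is an illustrative choice, not one from the paper); the paper's contribution is a machine-learned approximation of this decomposition for systems far too large to enumerate.

```python
import itertools
import math
from collections import Counter

def mutual_information(pairs):
    """I(X; Y) in bits from a list of (x, y) samples, uniformly weighted."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Hypothetical toy circuit: Y = (x0 AND x1) OR x2
def circuit(x):
    return (x[0] & x[1]) | x[2]

# Enumerate all microstates (uniform inputs) and the macroscale output.
states = list(itertools.product([0, 1], repeat=3))
outputs = [circuit(s) for s in states]

# Per-component relevance: I(x_i; Y) for each input bit.
for i in range(3):
    mi = mutual_information(list(zip((s[i] for s in states), outputs)))
    print(f"I(x{i}; Y) = {mi:.3f} bits")
```

Running this shows that `x2`, which can force the output on its own, carries substantially more information about `Y` than `x0` or `x1`, which is the kind of ranking of component-level variation the learned decomposition recovers at scale.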