Measuring the Redundancy of Information from a Source Failure Perspective (2404.01470v1)
Abstract: In this paper, we define a new measure of the redundancy of information from a fault-tolerance perspective. The partial information decomposition (PID) emerged over the last decade as a framework for decomposing the multi-source mutual information $I(T; X_1, \ldots, X_n)$ into atoms of redundant, synergistic, and unique information. It builds upon the notion of redundancy/synergy in McGill's interaction information (McGill 1954). Separately, the redundancy of system components has long served as a principle of fault-tolerant engineering for sensing, routing, and control applications. In that setting, redundancy is understood as the level of duplication necessary for a system to perform despite component failures. With these two perspectives in mind, we propose a new PID-based measure of redundancy $I_{\text{ft}}$, built on the presupposition that redundant information is robust to individual source failures. We demonstrate that this new measure satisfies the common PID axioms of (Williams and Beer 2010). To do so, we establish an order-reversing correspondence between collections of source-fallible instantiations of a system, on the one hand, and the PID lattice from (Williams and Beer 2010), on the other.
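The redundancy/synergy distinction that PID refines can be illustrated with McGill's interaction information on two toy systems: an XOR target (purely synergistic sources) and a duplicated source (purely redundant). The sketch below is illustrative only and does not implement the paper's $I_{\text{ft}}$; it uses the sign convention in which positive co-information indicates net redundancy and negative net synergy.

```python
from itertools import product
from math import log2

def marginal(joint, idx):
    """Marginalize a joint dict {tuple: prob} onto the coordinates in idx."""
    out = {}
    for k, p in joint.items():
        key = tuple(k[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def mutual_information(p_ab):
    """I(A;B) in bits from a joint dict {(a, b): prob}."""
    pa = marginal(p_ab, (0,))
    pb = marginal(p_ab, (1,))
    return sum(p * log2(p / (pa[(a,)] * pb[(b,)]))
               for (a, b), p in p_ab.items() if p > 0)

def interaction_information(p):
    """McGill's co-information I(X1; X2; T) over a dict {(x1, x2, t): prob}:
    I(T;X1) + I(T;X2) - I(T;X1,X2).
    Positive => net redundancy, negative => net synergy (McGill 1954)."""
    joint_srcs = {((x1, x2), t): v for (x1, x2, t), v in p.items()}
    return (mutual_information(marginal(p, (0, 2)))
            + mutual_information(marginal(p, (1, 2)))
            - mutual_information(joint_srcs))

# XOR: T = X1 xor X2 with independent uniform sources -> purely synergistic.
xor = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}
# COPY: X1 = X2 = T (a duplicated source) -> purely redundant.
copy = {(t, t, t): 0.5 for t in (0, 1)}

print(interaction_information(xor))   # -1.0 bit (synergy)
print(interaction_information(copy))  # +1.0 bit (redundancy)
```

The COPY system is the fault-tolerance intuition in miniature: either source alone carries the full 1 bit about $T$, so the system tolerates the failure of one source, whereas XOR loses everything if either source fails.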
- W. McGill, “Multivariate information transmission,” Transactions of the IRE Professional Group on Information Theory, vol. 4, no. 4, pp. 93–111, 1954.
- P. L. Williams and R. D. Beer, “Nonnegative decomposition of multivariate information,” arXiv preprint arXiv:1004.2515, 2010.
- A. Rullo, E. Serra, and J. Lobo, “Redundancy as a measure of fault-tolerance for the internet of things: A review,” Policy-Based Autonomic Data Governance, pp. 202–226, 2019.
- K. Marzullo, “Tolerating failures of continuous-valued sensors,” ACM Transactions on Computer Systems (TOCS), vol. 8, no. 4, pp. 284–304, 1990.
- S. Watanabe, “Information theoretical analysis of multivariate correlation,” IBM Journal of Research and Development, vol. 4, no. 1, pp. 66–82, 1960.
- E. Schneidman, W. Bialek, and M. J. Berry II, “Synergy, redundancy, and independence in population codes,” Journal of Neuroscience, vol. 23, no. 37, pp. 11539–11553, 2003.
- N. Timme, W. Alford, B. Flecker, and J. M. Beggs, “Synergy, redundancy, and multivariate information measures: an experimentalist’s perspective,” Journal of Computational Neuroscience, vol. 36, pp. 119–140, 2014.
- G. Chechik, A. Globerson, M. Anderson, E. Young, I. Nelken, and N. Tishby, “Group redundancy measures reveal redundancy reduction in the auditory pathway,” Advances in Neural Information Processing Systems, vol. 14, 2001.
- N. Bertschinger, J. Rauh, E. Olbrich, J. Jost, and N. Ay, “Quantifying unique information,” Entropy, vol. 16, no. 4, pp. 2161–2183, 2014.
- C. Finn and J. T. Lizier, “Pointwise partial information decomposition using the specificity and ambiguity lattices,” Entropy, vol. 20, no. 4, 2018.
- J. T. Lizier, N. Bertschinger, J. Jost, and M. Wibral, “Information decomposition of target effects from multi-source interactions: Perspectives on previous, current and future work,” Entropy, vol. 20, no. 4, 2018.
- D. A. Ehrlich, A. C. Schneider, V. Priesemann, M. Wibral, and A. Makkeh, “A measure of the complexity of neural representations based on partial information decomposition,” Transactions on Machine Learning Research, vol. 5, 2023.
- P. P. Liang, Y. Cheng, X. Fan, C. K. Ling, S. Nie, R. J. Chen, Z. Deng, N. Allen, R. Auerbach, F. Mahmood, et al., “Quantifying & modeling multimodal interactions: An information decomposition framework,” in Thirty-seventh Conference on Neural Information Processing Systems, 2023.
- R. A. Ince, “Measuring multivariate redundant information with pointwise common change in surprisal,” Entropy, vol. 19, no. 7, p. 318, 2017.
- A. Makkeh, A. J. Gutknecht, and M. Wibral, “Introducing a differentiable measure of pointwise shared information,” Physical Review E, vol. 103, no. 3, p. 032149, 2021.
- J. Crampton and G. Loizou, “Two partial orders on the set of antichains,” Research note, September 2000.