
Partial Information Decomposition for Continuous Variables based on Shared Exclusions: Analytical Formulation and Estimation (2311.06373v3)

Published 10 Nov 2023 in cs.IT, math.IT, math.PR, math.ST, stat.CO, and stat.TH

Abstract: Describing statistical dependencies is foundational to empirical scientific research. For uncovering intricate and possibly non-linear dependencies between a single target variable and several source variables within a system, a principled and versatile framework can be found in the theory of Partial Information Decomposition (PID). Nevertheless, the majority of existing PID measures are restricted to categorical variables, while many systems of interest in science are continuous. In this paper, we present a novel analytic formulation for continuous redundancy--a generalization of mutual information--drawing inspiration from the concept of shared exclusions in probability space as in the discrete PID definition of $I^{\mathrm{sx}}_{\cap}$. Furthermore, we introduce a nearest-neighbor based estimator for continuous PID, and showcase its effectiveness by applying it to a simulated energy management system provided by the Honda Research Institute Europe GmbH. This work bridges the gap between the measure-theoretically postulated existence proofs for a continuous $I^{\mathrm{sx}}_{\cap}$ and its practical application to real-world scientific problems.
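The nearest-neighbor estimator introduced in the paper belongs to the Kraskov-Stögbauer-Grassberger (KSG) family of k-nearest-neighbor information estimators (Phys. Rev. E 69:066138, 2004). For orientation, the sketch below implements the classical KSG mutual-information estimator, not the paper's PID estimator for $I^{\mathrm{sx}}_{\cap}$; the function name, the default k=4, and the SciPy-based implementation are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=4):
    """Classical KSG (algorithm 1) estimate of I(X;Y) in nats.

    x, y : sample arrays of shape (n,) or (n, d); assumed to be in
           general position (add tiny noise if the data contain ties).
    k    : number of nearest neighbours (small k: low bias, high variance).
    """
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)

    # Distance to the k-th nearest neighbour in the joint space
    # (Chebyshev norm); index 0 of the query result is the point itself.
    joint = cKDTree(np.hstack([x, y]))
    eps = joint.query(np.hstack([x, y]), k=k + 1, p=np.inf)[0][:, -1]

    # Count marginal neighbours strictly inside that distance; the query
    # point itself is always found, hence the "- 1".
    tx, ty = cKDTree(x), cKDTree(y)
    nx = [len(tx.query_ball_point(pt, r - 1e-12, p=np.inf)) - 1
          for pt, r in zip(x, eps)]
    ny = [len(ty.query_ball_point(pt, r - 1e-12, p=np.inf)) - 1
          for pt, r in zip(y, eps)]

    return (digamma(k) + digamma(n)
            - np.mean(digamma(np.add(nx, 1)) + digamma(np.add(ny, 1))))

# Sanity check on correlated Gaussians, where I(X;Y) = -0.5*log(1 - rho^2):
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.6 * x + 0.8 * rng.normal(size=5000)   # rho = 0.6
print(ksg_mutual_information(x, y))          # close to 0.223 nats
```

The paper generalizes this density-free, neighbor-counting approach from plain mutual information to the redundant part of the information that several source variables carry about a target.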
