How do correlations shape the landscape of information? (2312.00737v3)

Published 1 Dec 2023 in cs.IT, math.AG, math.IT, physics.bio-ph, and q-bio.NC

Abstract: We explore a few common models of how correlations affect information. The main model considered is the Shannon mutual information $I(S : R_1, \dots, R_n)$ over distributions with the marginals $P_{S,R_i}$ fixed for each $i$, under the analogy in which $S$ is a stimulus and the $R_i$ are neurons. We work out basic models in detail, using algebro-geometric tools to write down discriminants that partition the probability simplex into toric chambers of distributions with distinct qualitative behaviours, and we evaluate the volumes of these chambers algebraically. As a byproduct, we provide a direct translation between a decomposition of mutual information inspired by a series expansion and one arising from partial information decomposition (PID) problems, characterising the synergistic terms of the former. We hope this paper fosters communication between communities, especially mathematics and theoretical neuroscience, on this topic. KEYWORDS: information theory, algebraic statistics, mathematical neuroscience, partial information decomposition
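
To make the abstract's central quantity concrete, here is a minimal Python sketch (not from the paper) that computes $I(S : R_1, \dots, R_n)$ for a discrete joint distribution by treating the responses jointly as one composite variable. The function name and the example joint distribution are illustrative assumptions, not anything taken from the paper.

```python
import numpy as np

def mutual_information(p_joint):
    """Shannon mutual information I(S : R_1, ..., R_n) in bits.

    p_joint: array of shape (|S|, |R_1|, ..., |R_n|) holding the joint
    distribution P(s, r_1, ..., r_n). The stimulus S is axis 0; the
    responses R_1, ..., R_n occupy the remaining axes and are treated
    jointly as a single composite response variable.
    """
    p = np.asarray(p_joint, dtype=float)
    p_s = p.sum(axis=tuple(range(1, p.ndim)), keepdims=True)  # P(s)
    p_r = p.sum(axis=0, keepdims=True)                        # P(r_1, ..., r_n)
    mask = p > 0                                              # skip zero cells (0 log 0 = 0)
    return float(np.sum(p[mask] * np.log2(p[mask] / (p_s * p_r)[mask])))

# Hypothetical example: binary stimulus, two binary neurons. Holding the
# pairwise marginals P(S, R_i) fixed while changing the correlation between
# R_1 and R_2 at fixed S moves the joint within the probability simplex and
# changes I(S : R_1, R_2) -- the kind of effect the paper studies.
p = np.array([[[0.125, 0.125], [0.125, 0.125]],
              [[0.20, 0.05], [0.05, 0.20]]])
print(f"I(S : R1, R2) = {mutual_information(p):.4f} bits")
```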
