Integrated Information Decomposition Unveils Major Structural Traits of In Silico and In Vitro Neuronal Networks (2401.17478v2)

Published 30 Jan 2024 in q-bio.NC and cond-mat.dis-nn

Abstract: The properties of complex networked systems arise from the interplay between the dynamics of their elements and the underlying topology. Thus, to understand their behaviour, it is crucial to gather as much information as possible about their topological organization. However, in large systems such as neuronal networks, the reconstruction of such topology is usually carried out from the information encoded in the dynamics on the network, such as spike-train time series, for instance by measuring the Transfer Entropy between system elements. The topological information recovered by these methods does not necessarily capture the connectivity layout, but rather the causal flow of information between elements. New theoretical frameworks, such as Integrated Information Decomposition ($\Phi$-ID), allow one to explore the modes in which information can flow between parts of a system, opening a rich landscape of interactions between network topology, dynamics and information. Here, we apply $\Phi$-ID to in silico and in vitro data to decompose the usual Transfer Entropy measure into different modes of information transfer, namely synergistic, redundant and unique. We demonstrate that unique information transfer is the most relevant measure for uncovering structural topological details from network activity data, while redundant information only contributes residual information for this purpose. Although the retrieved network connectivity is still functional, it captures more details of the underlying structural topology by not taking into account emergent higher-order interactions and information redundancy between elements; these are important for the functional behaviour, but they mask the detection of the direct, simple interactions between elements that constitute the structural network topology.
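
As a concrete illustration of the pairwise quantity the abstract starts from, the sketch below estimates Transfer Entropy between two binary spike trains with a simple plug-in estimator and a history of one time step. This is a minimal sketch, not the authors' pipeline: the estimator, the history length, the variable names, and the toy data are all assumptions made only to show how information about a target's future, beyond what its own past provides, can be attributed to a putative source's past.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, base=2):
    """Plug-in estimate of TE(source -> target) with history length 1,
    for binary spike trains (1 = spike, 0 = silence).
    TE = I(source_past ; target_future | target_past)."""
    s_past = source[:-1]
    t_past = target[:-1]
    t_fut = target[1:]
    n = len(t_fut)

    joint = Counter(zip(t_fut, t_past, s_past))   # counts of (y_t, y_{t-1}, x_{t-1})
    cond = Counter(zip(t_past, s_past))           # counts of (y_{t-1}, x_{t-1})
    fut_past = Counter(zip(t_fut, t_past))        # counts of (y_t, y_{t-1})
    past = Counter(t_past)                        # counts of y_{t-1}

    te = 0.0
    for (yf, yp, xp), c in joint.items():
        p_joint = c / n
        p_num = c / cond[(yp, xp)]                # p(y_t | y_{t-1}, x_{t-1})
        p_den = fut_past[(yf, yp)] / past[yp]     # p(y_t | y_{t-1})
        te += p_joint * np.log(p_num / p_den)
    return te / np.log(base)

# Toy example (hypothetical data): the target loosely follows the source
# with a one-step delay, so TE should be clearly asymmetric.
rng = np.random.default_rng(0)
x = (rng.random(20_000) < 0.2).astype(int)
y = np.roll(x, 1) & (rng.random(20_000) < 0.8).astype(int)
print(transfer_entropy(x, y))   # noticeably above zero
print(transfer_entropy(y, x))   # close to zero
```

In the paper's setting, such pairwise estimates (there computed from in silico and in vitro spiking data) are the starting point that $\Phi$-ID further decomposes into synergistic, redundant and unique modes of information transfer; the decomposition itself is not reproduced in this sketch.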
