Maximum Caliber Infers Effective Coupling and Response from Spiking Networks (2405.15206v1)

Published 24 May 2024 in q-bio.NC and physics.bio-ph

Abstract: The characterization of network and biophysical properties from neural spiking activity is an important goal in neuroscience. A framework that provides unbiased inference on causal synaptic interactions and single-neuron properties has been missing. Here we applied the stochastic-dynamics extension of Maximum Entropy -- the Maximum Caliber Principle -- to infer the transition rates of network states. Effective synaptic coupling strengths and neuronal response functions for various network motifs can then be computed. The inferred minimal model also enables a leading-order reconstruction of the inter-spike-interval distribution. Our method is tested on numerically simulated spiking networks and applied to data from the salamander retina.
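The abstract describes inferring the transition rates of discrete network states from spike trains. As a minimal sketch of that first step (not the paper's full method), the snippet below binarizes a toy spike raster into population states and estimates the state-transition matrix from observed transition counts; when the Maximum Caliber constraints are the observed pairwise transition frequencies themselves, the fit reduces to exactly this empirical transition matrix. All variable names and the toy data are illustrative assumptions.

```python
import numpy as np

# Toy binarized spike raster: rows = time bins, columns = neurons (0/1).
# Real data would come from binned experimental spike trains.
rng = np.random.default_rng(0)
spikes = (rng.random((1000, 3)) < 0.2).astype(int)

# Encode each time bin's population pattern as an integer network state.
n_neurons = spikes.shape[1]
states = spikes @ (2 ** np.arange(n_neurons))  # values in 0 .. 2**n - 1

# Count observed transitions between consecutive network states.
n_states = 2 ** n_neurons
counts = np.zeros((n_states, n_states))
for s, s_next in zip(states[:-1], states[1:]):
    counts[s, s_next] += 1

# Row-normalize to get the empirical transition probability matrix;
# rows with no observed occupancy are left as zeros.
row_sums = counts.sum(axis=1, keepdims=True)
T = np.divide(counts, row_sums, out=np.zeros_like(counts),
              where=row_sums > 0)

print(T.shape)  # (8, 8)
```

From such a transition matrix one could, in principle, read off effective couplings by comparing transition probabilities conditioned on different partner-neuron states, which is the spirit of the paper's motif analysis.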
