Maximum Caliber Infers Effective Coupling and Response from Spiking Networks (2405.15206v1)
Abstract: Characterizing network and biophysical properties from neural spiking activity is an important goal in neuroscience. A framework that provides unbiased inference of causal synaptic interactions and single-neuron properties has been missing. Here we applied the stochastic-dynamics extension of Maximum Entropy, the Maximum Caliber Principle, to infer the transition rates of network states. Effective synaptic coupling strengths and neuronal response functions for various network motifs can then be computed. The inferred minimal model also enables a leading-order reconstruction of the inter-spike interval distribution. Our method is tested on numerically simulated spiking networks and applied to data from the salamander retina.
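The abstract describes inferring transition rates among binned population spike states as the first step of the Maximum Caliber analysis. The sketch below is a minimal, hypothetical illustration of that step under simple assumptions: binary spike trains are binned into network states (binary words), empirical transition counts are collected, and a transition-rate matrix is estimated to leading order in the bin width. The function names, toy data, and the leading-order formula W ≈ (P − I)/Δt are illustrative assumptions, not the authors' exact estimator.

```python
import numpy as np

def bin_states(spikes):
    """Map a (time_bins x n_neurons) binary spike array to integer network states.

    Each time bin's binary word (e.g. [1, 0] -> 1, [1, 1] -> 3) labels one state.
    """
    n_neurons = spikes.shape[1]
    weights = 2 ** np.arange(n_neurons)
    return spikes.astype(int) @ weights

def transition_rates(states, n_states, dt):
    """Estimate a transition-rate matrix from a discretely observed state sequence.

    Counts n_ij of observed i -> j transitions give empirical transition
    probabilities P_ij = n_ij / n_i (the maximum-likelihood Markov estimate);
    for small bin width dt, the rate matrix is approximated to leading order
    by W ~ (P - I) / dt (an illustrative assumption, not the paper's formula).
    """
    counts = np.zeros((n_states, n_states))
    for i, j in zip(states[:-1], states[1:]):
        counts[i, j] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0        # avoid dividing by zero for unvisited states
    P = counts / row_sums                # empirical transition-probability matrix
    return (P - np.eye(n_states)) / dt

# Toy example: two neurons, random binary spike trains binned at dt = 10 ms.
rng = np.random.default_rng(0)
spikes = (rng.random((5000, 2)) < 0.05).astype(int)
states = bin_states(spikes)
W = transition_rates(states, n_states=4, dt=0.010)
print(np.round(W, 2))
```

On this toy data the off-diagonal entries of W estimate how often the population word changes within a bin; the paper's subsequent quantities (effective coupling strengths, response functions, and the leading-order inter-spike interval reconstruction) build on such inferred rates.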