
Probabilistic models, compressible interactions, and neural coding (2112.14334v2)

Published 28 Dec 2021 in q-bio.NC and cond-mat.stat-mech

Abstract: In physics we often use very simple models to describe systems with many degrees of freedom, but it is not clear why or how this success can be transferred to the more complex biological context. We consider models for the joint distribution of many variables, as with the combinations of spiking and silence in large networks of neurons. In this probabilistic framework, we argue that simple models are possible if the mutual information between two halves of the system is consistently sub-extensive, and if this shared information is compressible. These conditions are not met generically, but they are met by real-world data such as natural images and the activity in a population of retinal output neurons. We introduce compression strategies that combine the information bottleneck with an iteration scheme inspired by the renormalization group, and find that the number of parameters needed to describe the distribution of joint activity scales with the square of the number of neurons, even though the interactions are not well approximated as pairwise. Our results also show that this shared information is essentially equal to the information that individual neurons carry about natural visual inputs, which has surprising implications for the neural code.
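One ingredient of the compression strategy named in the abstract is the information bottleneck. As a rough illustration of that ingredient alone, the sketch below implements the standard self-consistent information-bottleneck iteration (Tishby, Pereira & Bialek, 1999) for discrete variables. The function name, arguments, and random initialization are illustrative assumptions, not the paper's code; the paper additionally wraps a bottleneck step inside a renormalization-group-style iteration over halves of the neural population, which is not reproduced here.

```python
import numpy as np

def information_bottleneck(p_xy, n_clusters, beta, n_iter=200, seed=0):
    """Self-consistent information-bottleneck iteration for a discrete
    joint distribution p(x, y): compress X into T while preserving
    information about the relevance variable Y.

    p_xy : (n_x, n_y) array, joint distribution of X and Y.
    Returns the soft encoder p(t | x), shape (n_x, n_clusters).
    """
    rng = np.random.default_rng(seed)
    n_x, n_y = p_xy.shape
    p_x = p_xy.sum(axis=1)                        # marginal p(x)
    p_y_given_x = p_xy / p_x[:, None]             # conditional p(y | x)

    # random soft initialization of the encoder p(t | x)  (assumption)
    p_t_given_x = rng.random((n_x, n_clusters))
    p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)

    eps = 1e-12
    for _ in range(n_iter):
        p_t = p_x @ p_t_given_x                   # marginal p(t)
        # decoder: p(y | t) = sum_x p(y | x) p(t | x) p(x) / p(t)
        p_y_given_t = (p_t_given_x * p_x[:, None]).T @ p_y_given_x
        p_y_given_t /= p_t[:, None]
        # KL divergence D[p(y|x) || p(y|t)] for every (x, t) pair
        log_ratio = (np.log(p_y_given_x[:, None, :] + eps)
                     - np.log(p_y_given_t[None, :, :] + eps))
        kl = (p_y_given_x[:, None, :] * log_ratio).sum(axis=2)
        # encoder update: p(t | x) ∝ p(t) exp(-beta * KL)
        logits = np.log(p_t + eps)[None, :] - beta * kl
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        p_t_given_x = np.exp(logits)
        p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)

    return p_t_given_x
```

The trade-off parameter beta controls compression: small beta collapses X onto few codewords, while large beta preserves more of the information that X carries about Y, which is the knob the paper's scheme uses to test whether the shared information is compressible.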
