The Hydrodynamic Limit of Neural Networks with Balanced Excitation and Inhibition (2412.17273v1)
Abstract: The theory of 'Balanced Neural Networks' is a popular explanation for the high degree of variability and stochasticity in the brain's activity. We determine equations for the hydrodynamic limit of a balanced all-to-all network of 2n neurons for asymptotically large n. The neurons are divided into two classes (excitatory and inhibitory). Each excitatory neuron excites every other neuron, and each inhibitory neuron inhibits all of the other neurons. The model is of a stochastic hybrid nature, such that the synaptic response of each neuron is governed by an ordinary differential equation. The effect of neuron j on neuron k is dictated by a spiking Poisson process, with intensity given by a sigmoidal function of the synaptic potentiation of neuron j. The interactions are scaled by n^{-1/2}, which is much stronger than the n^{-1} scaling of classical interacting particle systems. We demonstrate that, under suitable conditions, the system does not blow up as n tends to infinity because the network activity is balanced between excitatory and inhibitory inputs. The limiting population dynamics is proved to be Gaussian: the mean is determined by the balance between excitation and inhibition, and the variance is determined by the Central Limit Theorem for inhomogeneous Poisson processes. The limiting equations can thus be expressed as autonomous ordinary differential equations for the means and variances.
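The model described above can be illustrated with a minimal finite-n simulation sketch. This is an assumption-laden illustration, not the authors' method: it uses a simple Euler discretization, a leaky linear ODE for the synaptic potentiation (the paper does not specify the exact ODE here), thinned Poisson spike counts per time step, and the n^{-1/2} coupling with signs +1 for the n excitatory and -1 for the n inhibitory neurons. The function name `simulate_balanced_network` and all parameter defaults are hypothetical.

```python
import numpy as np

def sigmoid(x):
    # Sigmoidal spiking intensity, as in the abstract.
    return 1.0 / (1.0 + np.exp(-x))

def simulate_balanced_network(n=200, T=2.0, dt=1e-3, tau=1.0, seed=0):
    """Euler simulation of 2n neurons: n excitatory, n inhibitory.

    Assumed dynamics (illustrative only): each neuron j emits spikes as a
    Poisson process with intensity sigmoid(x_j); the net input to every
    neuron is the signed spike count scaled by n**-0.5; x_k relaxes with
    time constant tau between spike-driven jumps.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 0.1, size=2 * n)                # synaptic potentiations
    signs = np.concatenate([np.ones(n), -np.ones(n)])   # excitatory, then inhibitory
    scale = n ** -0.5                                   # strong coupling scaling
    steps = int(T / dt)
    means = np.empty(steps)
    for t in range(steps):
        rates = sigmoid(x)
        spikes = rng.poisson(rates * dt)        # Poisson increments over [t, t+dt)
        drive = scale * np.dot(signs, spikes)   # balanced E-I input, common to all neurons
        x += dt * (-x / tau) + drive            # leaky ODE plus spike-driven input
        means[t] = x.mean()
    return x, means
```

With the excitatory and inhibitory populations of equal size and identical intensity functions, the O(n^{1/2}) excitatory and inhibitory drives cancel at leading order, which is why the trajectory stays bounded despite the n^{-1/2} scaling; the residual fluctuations are the Gaussian component the paper characterizes.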