Universality of the mean-field equations of networks of Hopfield-like neurons (2408.14290v1)
Abstract: We revisit the problem of characterising the mean-field limit of a network of Hopfield-like neurons. Building on the previous work of \cite{guionnet:95,ben-arous-guionnet:95,guionnet:97} and \cite{dembo_universality_2021}, we establish, for a large class of networks of Hopfield-like neurons, i.e. rate neurons, the mean-field equations on a time interval $[0,\,T]$, $T>0$, describing the thermodynamic limit of these networks, i.e. the limit as the number of neurons goes to infinity. Unlike all previous work, except \cite{dembo_universality_2021}, we do not assume that the synaptic weights describing the connections between the neurons are i.i.d. zero-mean Gaussians. The limit equations are stochastic and very simply described in terms of two functions, a ``correlation'' function denoted $K_Q(t,\,s)$ and a ``mean'' function denoted $m_Q(t)$. The ``noise'' part of the equations is a linear function of the Brownian motion, obtained by solving a Volterra equation of the second kind whose resolvent kernel is expressed as a function of $K_Q$. We give a constructive proof of the uniqueness of the limit equations. We use the corresponding algorithm for an effective computation of the functions $K_Q$ and $m_Q$, given the distribution of the weights. Several numerical experiments are reported.
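The abstract states that the noise term is obtained by solving a Volterra equation of the second kind whose resolvent kernel is built from $K_Q$. As a rough illustration of what such a step can look like numerically, here is a minimal Python sketch of a trapezoidal-rule solver for a generic second-kind Volterra equation $f(t) = g(t) + \int_0^t K(t,s)\,f(s)\,ds$; the kernel `K` and forcing term `g` below are hypothetical placeholders, not the authors' $K_Q$-based kernel or their algorithm.

```python
import numpy as np

def solve_volterra_second_kind(kernel, g, T=1.0, n=200):
    """Solve f(t) = g(t) + int_0^t kernel(t, s) f(s) ds on [0, T]
    by time-marching with the trapezoidal quadrature rule."""
    t = np.linspace(0.0, T, n + 1)
    h = T / n
    f = np.zeros(n + 1)
    f[0] = g(t[0])  # at t = 0 the integral term vanishes
    for i in range(1, n + 1):
        # Trapezoidal weights: h/2 at the endpoints, h in between.
        w = np.full(i + 1, h)
        w[0] = w[-1] = h / 2.0
        k_row = np.array([kernel(t[i], t[j]) for j in range(i + 1)])
        # Known part: g(t_i) plus the quadrature over already-computed values.
        rhs = g(t[i]) + np.dot(w[:-1] * k_row[:-1], f[:i])
        # f(t_i) also appears inside the integral term; solve for it.
        f[i] = rhs / (1.0 - w[-1] * k_row[-1])
    return t, f

if __name__ == "__main__":
    # Hypothetical stand-ins for the kernel and forcing term.
    K = lambda t, s: np.exp(-(t - s))
    g = lambda t: np.cos(t)
    t, f = solve_volterra_second_kind(K, g, T=2.0, n=400)
    print(f[-1])
```

This is only a sketch under the stated assumptions; the paper's actual computation of $K_Q$ and $m_Q$ follows the constructive uniqueness proof described above.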