Context-dependent representation in recurrent neural networks (1506.06602v2)
Abstract: To assess the short-term memory performance of non-linear random neural networks, we introduce a measure quantifying how strongly a neural representation depends on past context. We study this measure both numerically and theoretically using mean-field theory for random neural networks, showing that there exists an optimal level of synaptic weight heterogeneity. We further investigate the influence of network topology, in particular the symmetry of reciprocal synaptic connections, on this measure of context dependence, revealing the importance of the interplay between non-linearities and connectivity structure.
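A minimal sketch of how such a context-dependence measure could be probed numerically, assuming a standard random rate network x_{t+1} = tanh(W x_t + u_t) with Gaussian weights of standard deviation g/sqrt(N), where g sets the heterogeneity level. The measure used here (state distance after two different past contexts followed by an identical recent input) is an illustrative proxy, not the paper's exact definition, and the function names are hypothetical.

```python
import numpy as np

def simulate(W, inputs, x0=None):
    """Iterate the rate dynamics x_{t+1} = tanh(W @ x_t + u_t)."""
    N = W.shape[0]
    x = np.zeros(N) if x0 is None else x0.copy()
    for u in inputs:
        x = np.tanh(W @ x + u)
    return x

def context_dependence(g, N=200, T_context=50, T_common=10, seed=0):
    """Distance between final states after two different past contexts
    followed by the same common input suffix (illustrative measure)."""
    rng = np.random.default_rng(seed)
    # Random synaptic weights; g controls heterogeneity (std = g / sqrt(N)).
    W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    # Two distinct past contexts, then one shared recent input sequence.
    ctx_a = rng.normal(size=(T_context, N))
    ctx_b = rng.normal(size=(T_context, N))
    common = rng.normal(size=(T_common, N))
    xa = simulate(W, np.concatenate([ctx_a, common]))
    xb = simulate(W, np.concatenate([ctx_b, common]))
    # Normalized distance: 0 means the past context has been forgotten.
    return np.linalg.norm(xa - xb) / np.sqrt(N)

if __name__ == "__main__":
    # Sweep the heterogeneity level g; under the mean-field picture,
    # context dependence is expected to peak at an intermediate g.
    for g in [0.5, 0.9, 1.2, 1.5, 2.0, 3.0]:
        d = np.mean([context_dependence(g, seed=s) for s in range(5)])
        print(f"g = {g:.1f}  context dependence = {d:.3f}")
```

Averaging over several weight realizations (the seed loop) mimics the quenched-disorder averages of the mean-field analysis; sweeping g is one way to look for the optimal heterogeneity the abstract describes.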