
Context-dependent representation in recurrent neural networks (1506.06602v2)

Published 22 Jun 2015 in cond-mat.dis-nn and q-bio.NC

Abstract: To assess the short-term memory performance of non-linear random neural networks, we introduce a measure that quantifies how strongly a neural representation depends on past context. We study this measure both numerically and theoretically using the mean-field theory of random neural networks, showing the existence of an optimal level of synaptic weight heterogeneity. We further investigate the influence of network topology, in particular the symmetry of reciprocal synaptic connections, on this measure of context dependence, revealing the importance of the interplay between non-linearities and connectivity structure.
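
To make the setup concrete, below is a minimal sketch of the kind of experiment the abstract describes: a discrete-time random recurrent network x_{t+1} = tanh(J x_t + u_t) with Gaussian couplings scaled by a gain parameter g (a common proxy for synaptic weight heterogeneity). The context-dependence measure used here, the average distance between states reached after two different past contexts followed by an identical recent input sequence, is an illustrative assumption, not necessarily the paper's exact definition.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(J, inputs):
    """Iterate the discrete-time dynamics x_{t+1} = tanh(J x_t + u_t)."""
    x = np.zeros(J.shape[0])
    for u in inputs:
        x = np.tanh(J @ x + u)
    return x

def context_dependence(J, n, t_context=50, t_common=10, trials=20):
    """Illustrative measure: mean distance between final states after two
    different contexts followed by the same recent input sequence."""
    d = 0.0
    for _ in range(trials):
        common = [rng.standard_normal(n) for _ in range(t_common)]
        ctx_a = [rng.standard_normal(n) for _ in range(t_context)]
        ctx_b = [rng.standard_normal(n) for _ in range(t_context)]
        xa = simulate(J, ctx_a + common)
        xb = simulate(J, ctx_b + common)
        d += np.linalg.norm(xa - xb) / np.sqrt(n)
    return d / trials

n = 200
for g in (0.5, 1.0, 1.5, 2.0):  # gain g controls coupling heterogeneity
    J = g * rng.standard_normal((n, n)) / np.sqrt(n)
    print(f"g = {g:.1f}: context dependence = {context_dependence(J, n):.3f}")
```

Sweeping g in a sketch like this is one way to probe the abstract's claim of an optimal heterogeneity level: too little gain and the network forgets all context, too much and chaotic dynamics wash out the shared recent input.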
