Paradoxical increase of capacity due to spurious overlaps in attractor networks (2510.17593v1)
Abstract: In Hopfield-type associative memory models, memories are stored in the connectivity matrix and can subsequently be retrieved thanks to the collective dynamics of the network. In these models, retrieval of a particular memory can be hampered by overlaps between the network state and the other stored memories, termed spurious overlaps because they collectively introduce noise into the retrieval process. In classic models, spurious overlaps increase the variance of synaptic inputs but do not affect their mean. We show here that in models equipped with a learning rule inferred from neurobiological data, spurious overlaps collectively reduce the mean synaptic input to neurons, and that this reduction in the mean in turn increases storage capacity through a sparsening of network activity. Our paper demonstrates a link between a specific feature of experimentally inferred plasticity rules and network storage capacity.
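To make the contrast concrete, below is a minimal NumPy sketch of the classic case the abstract argues against: a standard Hopfield network with ±1 patterns and the Hebbian rule, where the crosstalk from spurious overlaps has (approximately) zero mean and variance of order P/N. The network size N, pattern count P, and variable names are illustrative assumptions; the data-inferred plasticity rule studied in the paper is not reproduced here.

```python
# Sketch: crosstalk ("spurious overlap") noise in a classic Hopfield network.
# Illustrates the textbook case only: the crosstalk term has near-zero mean
# and variance ~ (P-1)/N. The paper's experimentally inferred learning rule,
# under which the crosstalk acquires a negative mean, is NOT implemented here.
import numpy as np

rng = np.random.default_rng(0)
N, P = 1000, 50                         # neurons, stored patterns (assumed values)
xi = rng.choice([-1, 1], size=(P, N))   # random binary memories

# Hebbian connectivity J_ij = (1/N) * sum_mu xi^mu_i xi^mu_j, no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Set the network state to memory 0 and decompose the synaptic input
# h_i = (1 - 1/N) xi^0_i  (signal)  +  crosstalk from the other P-1 memories.
s = xi[0]
h = J @ s
crosstalk = h - (1 - 1 / N) * s         # subtract the exact signal term

print(f"crosstalk mean     : {crosstalk.mean():+.4f}  (theory: 0)")
print(f"crosstalk variance : {crosstalk.var():.4f}  (theory ~ (P-1)/N = {(P - 1) / N:.4f})")
```

In this classic setting the spurious overlaps act purely as zero-mean noise; the paper's result is that under a neurobiologically inferred rule the analogous crosstalk term instead acquires a systematic negative mean, which sparsens activity and, paradoxically, raises capacity.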