Mixed memories in Hopfield networks (2504.04879v1)
Abstract: We consider the class of Hopfield models of associative memory with activation function $F$ and state space $\{-1,1\}^N$, where each vertex of the cube describes a configuration of $N$ binary neurons. $M$ randomly chosen configurations, called patterns, are stored using an energy function designed to make them local minima. If they are, which is known to depend on how $M$ scales with $N$, then they can be retrieved using a dynamics that decreases the energy. However, storing the patterns in the energy function also creates unintended local minima, and thus false memories. Although this has been known since the earliest work on the subject, it has only been supported by numerical simulations and non-rigorous calculations, except in elementary cases. Our results are twofold. For a generic function $F$, we explicitly construct a set of configurations, called mixed memories, whose properties are intended to characterise the local minima of the energy function. For three prominent models, namely the classical, the dense and the modern Hopfield models, obtained for quadratic, polynomial and exponential functions $F$ respectively, we give conditions on the growth rate of $M$ which guarantee that, as $N$ diverges, mixed memories are fixed points of the retrieval dynamics and thus exact minima of the energy. We conjecture that in this regime, all local minima are mixed memories.
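To make the setup concrete, here is a minimal sketch of a generalized Hopfield network with energy $E(\sigma) = -\sum_{\mu=1}^{M} F(\langle \xi^\mu, \sigma \rangle)$, as described in the abstract. The greedy single-spin-flip descent used here is one common energy-decreasing dynamics, not necessarily the paper's exact retrieval rule, and all function names, parameter values (e.g. $N=100$, $M=5$, $p=4$), and the mixed-memory construction shown at the end are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 100, 5                            # neurons, stored patterns (illustrative sizes)
xi = rng.choice([-1, 1], size=(M, N))    # M random binary patterns on {-1,1}^N

def energy(sigma, F):
    """E(sigma) = -sum_mu F(<xi^mu, sigma>), the generalized Hopfield energy."""
    return -F(xi @ sigma).sum()

def retrieve(sigma, F, sweeps=20):
    """Greedy single-spin-flip descent: one common energy-decreasing dynamics."""
    sigma = sigma.copy()
    for _ in range(sweeps):
        changed = False
        for i in rng.permutation(N):
            trial = sigma.copy()
            trial[i] = -trial[i]          # flip neuron i
            if energy(trial, F) < energy(sigma, F):
                sigma, changed = trial, True
        if not changed:                   # fixed point: a local minimum of E
            break
    return sigma

# The three activation functions F named in the abstract:
F_classical = lambda x: x**2    # classical Hopfield model (quadratic)
F_dense     = lambda x: x**4    # dense model (polynomial; p = 4 is an arbitrary choice)
F_modern    = np.exp            # modern Hopfield model (exponential)

# Retrieval: start near pattern 0 (flip 10% of its spins) and descend.
probe = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] = -probe[flip]
out = retrieve(probe, F_classical)
print("overlap with pattern 0:", (out @ xi[0]) / N)   # near 1.0 on successful retrieval

# A candidate "mixed memory": the sign of a sum of an odd number of patterns.
# Whether such mixtures are fixed points is exactly the regime question the paper studies.
mix = np.sign(xi[0] + xi[1] + xi[2]).astype(int)
print("3-mixture is a fixed point:", np.array_equal(retrieve(mix, F_classical), mix))
```

In this sketch the stored patterns act as intended local minima, while the printed check on the 3-pattern mixture probes the unintended ones: when $M$ grows slowly enough relative to $N$, such mixtures can themselves be fixed points of the dynamics, i.e. false memories.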