Stochastic Thermodynamics of Associative Memory
Abstract: Dense Associative Memory networks (DenseAMs) unify several popular paradigms in AI, such as Hopfield Networks, transformers, and diffusion models, while casting their computational properties into the language of dynamical systems and energy landscapes. This formulation provides a natural setting for studying thermodynamics and computation in neural systems, because DenseAMs are simultaneously simple enough to admit analytic treatment and rich enough to implement nontrivial computational functions. Aspects of these networks have been studied at equilibrium and at zero temperature, but the thermodynamic costs associated with their operation out of equilibrium are largely unexplored. Here, we define the thermodynamic entropy production associated with the operation of such networks, and study polynomial DenseAMs at intermediate memory load. At large system sizes, we use dynamical mean field theory to characterize work requirements and memory transition times when driving the system with corrupted memories. We find tradeoffs between entropy production, memory retrieval accuracy, and operation speed.
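To make the setup concrete, the following is a minimal sketch (not the paper's implementation) of a polynomial Dense Associative Memory with stochastic single-spin dynamics. It assumes the standard DenseAM energy $E(\sigma) = -\sum_\mu F(\xi_\mu \cdot \sigma / N)$ with polynomial interaction $F(x) = x^n$, binary neurons, and finite-temperature Glauber updates; all sizes, the degree `n`, and the inverse temperature `beta` are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # number of neurons (illustrative size)
K = 5    # number of stored memory patterns
n = 3    # polynomial degree of the interaction function F(x) = x**n

# Random binary memory patterns xi_mu in {-1, +1}^N
patterns = rng.choice([-1.0, 1.0], size=(K, N))

def energy(sigma):
    """Polynomial DenseAM energy E = -sum_mu F(xi_mu . sigma / N), F(x) = x^n."""
    overlaps = patterns @ sigma / N  # normalized overlaps m_mu
    return -np.sum(overlaps ** n)

def glauber_step(sigma, beta):
    """One finite-temperature single-spin flip, accepted with Glauber probability."""
    i = rng.integers(N)
    proposal = sigma.copy()
    proposal[i] = -proposal[i]
    dE = energy(proposal) - energy(sigma)
    if rng.random() < 1.0 / (1.0 + np.exp(beta * dE)):
        return proposal
    return sigma

# Drive the network with a corrupted memory: flip 10% of the bits of pattern 0
sigma = patterns[0].copy()
corrupted_sites = rng.choice(N, size=N // 10, replace=False)
sigma[corrupted_sites] *= -1

# Relax under stochastic dynamics toward the stored memory
for _ in range(5000):
    sigma = glauber_step(sigma, beta=200.0)

overlap = patterns[0] @ sigma / N
print(f"overlap with stored pattern after dynamics: {overlap:.2f}")
```

At low temperature (large `beta`) the dynamics descends the energy landscape and the overlap with the stored pattern approaches 1; raising the temperature or lowering the degree `n` trades retrieval accuracy against the dissipation and speed considerations analyzed in the paper.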