
Stochastic Thermodynamics of Associative Memory

Published 3 Jan 2026 in cond-mat.stat-mech | (2601.01253v1)

Abstract: Dense Associative Memory networks (DenseAMs) unify several popular paradigms in AI, such as Hopfield Networks, transformers, and diffusion models, casting their computational properties into the language of dynamical systems and energy landscapes. This formulation provides a natural setting for studying thermodynamics and computation in neural systems, because DenseAMs are simultaneously simple enough to admit analytic treatment and rich enough to implement nontrivial computational function. Aspects of these networks have been studied at equilibrium and at zero temperature, but the thermodynamic costs associated with their operation out of equilibrium are largely unexplored. Here, we define the thermodynamic entropy production associated with the operation of such networks, and study polynomial DenseAMs at intermediate memory load. At large system sizes, we use dynamical mean field theory to characterize work requirements and memory transition times when driving the system with corrupted memories. We find tradeoffs between entropy production, memory retrieval accuracy, and operation speed.
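To make the "polynomial DenseAM" setting concrete, here is a minimal sketch of a dense associative memory with a polynomial energy function, retrieving a corrupted memory by energy descent. The parameters (N, K, n), the degree-n separation function, and the asynchronous update loop are illustrative assumptions in the Krotov-Hopfield style, not the paper's exact model or its stochastic dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: N neurons, K stored memories, polynomial degree n.
# These are hypothetical choices, not taken from the paper.
N, K, n = 100, 5, 3
xi = rng.choice([-1, 1], size=(K, N))  # random binary memories

def F(x):
    # polynomial separation function; higher n sharpens the energy wells
    return x ** n

def energy(sigma):
    # E(sigma) = -sum_mu F(xi_mu . sigma)
    return -F(xi @ sigma).sum()

def sweep(sigma):
    # one asynchronous sweep: set each spin to the lower-energy state
    for i in range(N):
        plus, minus = sigma.copy(), sigma.copy()
        plus[i], minus[i] = 1, -1
        sigma[i] = 1 if energy(plus) <= energy(minus) else -1
    return sigma

# corrupt 15% of the first memory, then relax toward the energy minimum
sigma = xi[0].copy()
flip = rng.choice(N, size=15, replace=False)
sigma[flip] *= -1

for _ in range(5):
    sigma = sweep(sigma)

overlap = (sigma @ xi[0]) / N  # overlap of 1.0 means perfect retrieval
```

This zero-temperature descent illustrates the energy-landscape picture the abstract invokes; the paper's analysis concerns the stochastic (finite-temperature) version of such dynamics, where entropy production during retrieval becomes well defined.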
