Storage and Learning phase transitions in the Random-Features Hopfield Model

Published 29 Mar 2023 in cond-mat.dis-nn and cs.LG (arXiv:2303.16880v2)

Abstract: The Hopfield model is a paradigmatic model of neural networks that has been analyzed for many decades in the statistical physics, neuroscience, and machine learning communities. Inspired by the manifold hypothesis in machine learning, we propose and investigate a generalization of the standard setting that we name the Random-Features Hopfield Model. Here $P$ binary patterns of length $N$ are generated by applying a random projection followed by a non-linearity to Gaussian vectors sampled in a latent space of dimension $D$. Using the replica method from statistical physics, we derive the phase diagram of the model in the limit $P,N,D\to\infty$ with fixed ratios $\alpha=P/N$ and $\alpha_D=D/N$. Besides the usual retrieval phase, where the patterns can be dynamically recovered from some initial corruption, we uncover a new phase where the features characterizing the projection can be recovered instead. We call this phenomenon the learning phase transition, as the features are not explicitly given to the model but rather are inferred from the patterns in an unsupervised fashion.
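As a rough illustration of the pattern-generation process described in the abstract, the following minimal sketch builds random-features patterns and the standard Hebbian couplings of a Hopfield network. The choice of sign() as the non-linearity, the 1/sqrt(D) normalization, and the Hebbian learning rule are assumptions for illustration; the paper itself only specifies "a random projection followed by a non-linearity".

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative sizes; the paper works in the limit P, N, D -> infinity
# with fixed ratios alpha = P/N and alpha_D = D/N.
N = 1000            # pattern length (number of spins/neurons)
alpha = 0.05        # load P / N
alpha_D = 0.2       # latent ratio D / N
P = int(alpha * N)
D = int(alpha_D * N)

# Random projection ("features") from the latent space R^D to R^N
F = rng.standard_normal((N, D))

# Gaussian latent vectors, one per pattern
Z = rng.standard_normal((D, P))

# Binary patterns: non-linearity (here sign, an assumption) applied to the
# projected latent vectors; entries are in {-1, +1}
xi = np.sign(F @ Z / np.sqrt(D))        # shape (N, P)

# Standard Hopfield (Hebbian) couplings built from these patterns
J = (xi @ xi.T) / N
np.fill_diagonal(J, 0.0)
```

In the retrieval phase, zero-temperature dynamics initialized near a column of `xi` would converge back to that pattern; in the learning phase identified in the paper, fixed points correlated with the columns of the projection `F` appear instead.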

Citations (10)
