The importance of being empty: a spectral approach to Hopfield neural networks with diluted examples (2503.15353v1)

Published 19 Mar 2025 in cond-mat.dis-nn, math-ph, and math.MP

Abstract: We consider Hopfield networks, where neurons interact pairwise through Hebbian couplings built over (i) a set of definite patterns (ground truths), (ii) a sample of labeled examples (supervised setting), or (iii) a sample of unlabeled examples (unsupervised setting). We focus on the case where the ground truths are Rademacher vectors and the examples are noisy versions of these ground truths, possibly displaying some blank entries (e.g., mimicking missing or dropped data), and we determine the spectral distribution of the coupling matrices in the three scenarios by exploiting and extending the Marchenko-Pastur theorem. By leveraging this knowledge, we can analytically inspect the stability and attractiveness of the ground truths, as well as the generalization capabilities of the networks. In particular, as corroborated by long-running Monte Carlo simulations, the presence of blank entries can be beneficial under some specific conditions, suggesting strategies based on data sparsification; the robustness of these results on structured datasets is confirmed numerically. Finally, we demonstrate that the Hebbian matrix, built on sparse examples, can be recovered as the fixed point of a gradient-descent algorithm with dropout over a suitable loss function.
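As a toy illustration of the setup the abstract describes (not the paper's actual construction), the sketch below samples Rademacher ground truths, generates noisy examples with blank (zeroed) entries, builds a supervised Hebbian coupling matrix from the class-averaged examples, and computes its spectrum. All parameter values (`N`, `K`, `M`, the flip rate `r`, and the dilution rate `d`) are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, M = 400, 3, 50   # neurons, ground truths, examples per ground truth
r, d = 0.2, 0.3        # flip probability (noise), blank probability (dilution)

# Rademacher ground truths: i.i.d. +/-1 entries
xi = rng.choice([-1, 1], size=(K, N))

# Noisy, diluted examples: each entry flipped with prob r, blanked (set to 0) with prob d
flips = rng.choice([1, -1], size=(K, M, N), p=[1 - r, r])
kept = rng.random((K, M, N)) >= d          # True where the entry is not blank
eta = xi[:, None, :] * flips * kept        # shape (K, M, N)

# Supervised Hebbian coupling: average examples within each class, then apply the Hebb rule
eta_bar = eta.mean(axis=1)                 # (K, N)
J = eta_bar.T @ eta_bar / N                # (N, N), symmetric, rank <= K
np.fill_diagonal(J, 0.0)                   # no self-couplings

eigs = np.linalg.eigvalsh(J)               # empirical spectrum to compare with theory
```

In the paper's spirit, the bulk of `eigs` would be compared against a (suitably extended) Marchenko-Pastur law, with the few outlying eigenvalues tracking the retrieval directions associated with the ground truths.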
