
Spherical perceptron as a storage memory with limited errors (1306.3809v1)

Published 17 Jun 2013 in math.PR, math-ph, math.MP, and stat.ML

Abstract: It has been known for a long time that the classical spherical perceptrons can be used as storage memories. The seminal work of Gardner, \cite{Gar88}, initiated an analytical study of perceptrons' storage abilities. Many of Gardner's predictions obtained through statistical mechanics tools have been rigorously justified. Among the most important of these are, of course, the storage capacities. The first rigorous confirmations were obtained in \cite{SchTir02,SchTir03} for the storage capacity of the so-called positive spherical perceptron. These were later reestablished in \cite{TalBook} and a bit more recently in \cite{StojnicGardGen13}. In this paper we consider a variant of the spherical perceptron that operates as a storage memory but allows for a certain fraction of errors. In Gardner's original work the statistical mechanics predictions in this direction were presented as well. Here, through a mathematically rigorous analysis, we confirm that Gardner's predictions in this direction are in fact provable upper bounds on the true values of the storage capacity. Moreover, we then present a mechanism that can be used to lower these bounds. Numerical results that we present indicate that Gardner's storage capacity predictions may, over a fairly wide range of parameters, not be that far away from the true values.
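For context, the error-free baseline behind the predictions discussed above is Gardner's classical formula for the storage capacity of a spherical perceptron with margin $\kappa \ge 0$. The statement below is the standard form of that result and is included only as background; it is not quoted from this paper, whose limited-errors variant modifies it:

\[
\alpha_c(\kappa) \;=\; \Bigl(\int_{-\kappa}^{\infty}\frac{e^{-z^{2}/2}}{\sqrt{2\pi}}\,(z+\kappa)^{2}\,dz\Bigr)^{-1},
\qquad\text{so in particular }\alpha_c(0)=2,
\]

the value rigorously confirmed for the positive spherical perceptron in \cite{SchTir02,SchTir03} and \cite{TalBook}.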

Citations (14)
