Exponential capacity of associative memories under quantum annealing recall (1602.08149v1)
Abstract: Associative memory models, in theoretical neuro- and computer sciences, can generally store only a sublinear number of memories. We show that using quantum annealing for recall tasks endows associative memory models with exponential storage capacity. Theoretically, we obtain the radius of the attractor basins, $R(N)$, the capacity, $C(N)$, of such a scheme, and their tradeoffs. Our calculations establish that for randomly chosen memories the capacity of a model using the Hebbian learning rule with recall via quantum annealing is exponential in the size of the problem, $C(N)=\mathcal{O}(e^{C_1 N})$, $C_1\geq 0$, and that recall succeeds on randomly chosen memory sets with probability $1-e^{-C_2 N}$, $C_2\geq 0$, with $C_1+C_2=(0.5-f)^2/(1-f)$, where $f=R(N)/N$, $0\leq f\leq 0.5$, is the radius of attraction, i.e., the Hamming distance of an input probe from a stored memory as a fraction of the problem size. We demonstrate the application of this scheme on a programmable quantum annealing device, the D-Wave processor.
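A minimal sketch of the two ingredients the abstract names: the Hebbian weight matrix and the capacity exponent $C_1+C_2=(0.5-f)^2/(1-f)$. It is written under the standard Hopfield/Hebbian assumptions; the function names, the probe-bias strength `g`, and the bias term itself are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def hebbian_weights(memories: np.ndarray) -> np.ndarray:
    """Hebbian learning rule: W_ij = sum_mu xi_i^mu xi_j^mu, zero diagonal.

    memories: shape (p, N), entries in {-1, +1}.
    """
    W = memories.T @ memories
    np.fill_diagonal(W, 0)
    return W

def capacity_exponent(f: float) -> float:
    """C_1 + C_2 = (0.5 - f)^2 / (1 - f), with f = R(N)/N in [0, 0.5]."""
    if not 0.0 <= f <= 0.5:
        raise ValueError("f must lie in [0, 0.5]")
    return (0.5 - f) ** 2 / (1 - f)

rng = np.random.default_rng(0)
N, p = 64, 8
memories = rng.choice([-1, 1], size=(p, N))
W = hebbian_weights(memories)

# Classical Ising energy that the anneal targets at the end of the schedule.
# A probe-dependent local-field term (strength g, an assumption here) steers
# the system toward the basin of the stored memory nearest the probe.
probe = memories[0].copy()
probe[: N // 10] *= -1  # corrupt ~10% of bits, well inside f <= 0.5
g = 1.0

def energy(s: np.ndarray) -> float:
    return -0.5 * (s @ W @ s) - g * (probe @ s)

print(energy(memories[0]) < energy(probe))  # stored memory has lower energy
print(capacity_exponent(0.1))               # exponent per spin at f = 0.1, ~0.178
```

On hardware, $W$ and the probe bias would be programmed as the couplers and local fields of the annealer's final Ising Hamiltonian; the sketch above only evaluates the classical energies that such an anneal would minimize.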