Exponential capacity of associative memories under quantum annealing recall

Published 25 Feb 2016 in quant-ph and cs.OH (arXiv:1602.08149v1)

Abstract: Associative memory models, in theoretical neuro- and computer sciences, can generally store only a sublinear number of memories. We show that using quantum annealing for recall endows associative memory models with exponential storage capacity. Theoretically, we obtain the radius of the attractor basins, $R(N)$, the capacity, $C(N)$, of such a scheme, and their tradeoffs. Our calculations establish that, for randomly chosen memories, the capacity of a model using the Hebbian learning rule with recall via quantum annealing is exponential in the size of the problem, $C(N)=\mathcal{O}(e^{C_1 N})$, $C_1\geq 0$, and that recall succeeds on randomly chosen memory sets with probability $(1-e^{-C_2 N})$, $C_2\geq 0$, with $C_1+C_2=(0.5-f)^2/(1-f)$, where $f=R(N)/N$, $0\leq f\leq 0.5$, is the radius of attraction, i.e., the Hamming distance of an input probe from a stored memory as a fraction of the problem size. We demonstrate this scheme on a programmable quantum annealing device, the D-Wave processor.
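The classical ingredients quoted in the abstract can be sketched numerically: the Hebbian weight matrix built from random $\pm 1$ memories, and the capacity exponent $C_1+C_2=(0.5-f)^2/(1-f)$ as a function of the fractional attraction radius $f$. This is a minimal illustration of those two formulas only, not the authors' quantum-annealing recall procedure; the function names are my own.

```python
import numpy as np

def hebbian_weights(memories):
    """Hebbian learning rule: W = (1/N) * sum_mu outer(xi_mu, xi_mu),
    with the diagonal zeroed (no self-coupling)."""
    P, N = memories.shape
    W = memories.T @ memories / N
    np.fill_diagonal(W, 0.0)
    return W

def capacity_exponent(f):
    """Exponent C1 + C2 = (0.5 - f)^2 / (1 - f) from the abstract,
    where f = R(N)/N is the attraction radius as a fraction of N."""
    assert 0.0 <= f <= 0.5
    return (0.5 - f) ** 2 / (1.0 - f)

# Example: N = 8 spins, 3 random +/-1 memories.
rng = np.random.default_rng(0)
memories = rng.choice([-1, 1], size=(3, 8))
W = hebbian_weights(memories)
print(W.shape)                 # (8, 8)
print(capacity_exponent(0.0))  # 0.25, i.e. capacity ~ e^{0.25 N}
print(capacity_exponent(0.5))  # 0.0: no exponential gain at maximal radius
```

Note that the exponent is maximal ($0.25$) when no noise tolerance is demanded ($f=0$) and vanishes as $f\to 0.5$, which is the tradeoff between basin radius and capacity stated in the abstract.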

Citations (13)
