Coupled Neural Associative Memories (1301.1555v5)

Published 8 Jan 2013 in cs.NE, cs.IT, cs.LG, and math.IT

Abstract: We propose a novel architecture for designing a neural associative memory that is capable of learning a large number of patterns and recalling them later in the presence of noise. It is based on dividing the neurons into local clusters and parallel planes, closely resembling the architecture of the visual cortex of the macaque brain. The features our proposed architecture shares with spatially-coupled codes enable us to show that such networks eliminate noise drastically better than previous approaches while retaining the ability to learn an exponentially large number of patterns. Previous work either failed to provide good performance during the recall phase or could not offer large pattern retrieval (storage) capacities. We also present computational experiments that lend additional support to the theoretical analysis.
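To give a flavor of cluster-based recall, the sketch below shows a minimal toy version of local error correction in overlapping neuron clusters, written in Python with NumPy. It assumes stored patterns lie in the null space of per-cluster constraint matrices and that recall proceeds by repeatedly nudging the most constraint-violating neuron; the function name, the cluster layout, and the flipping rule are illustrative assumptions, not the authors' exact algorithm.

# Minimal, illustrative sketch (not the paper's exact method) of recall in a
# clustered associative memory.  Assumptions: each cluster c has a constraint
# matrix constraints[c] such that constraints[c] @ x[clusters[c]] is (close to)
# zero for every stored pattern x, and noise perturbs a few entries by +/-1.

import numpy as np

def recall(x_noisy, clusters, constraints, max_iters=50, tol=1e-6):
    """Iteratively nudge the most 'suspicious' neuron until all clusters are satisfied.

    x_noisy     : 1-D array, the noisy query pattern.
    clusters    : list of index arrays; clusters[c] lists the neurons in cluster c.
    constraints : list of matrices; constraints[c] @ x[clusters[c]] ~ 0 for stored patterns.
    """
    x = x_noisy.astype(float).copy()
    n = x.size
    for _ in range(max_iters):
        score = np.zeros(n)   # per-neuron violation magnitude, summed over its clusters
        grad = np.zeros(n)    # signed feedback used to decide the direction of the nudge
        for idx, W in zip(clusters, constraints):
            syndrome = W @ x[idx]                  # zero iff this cluster is satisfied
            score[idx] += np.abs(W.T @ syndrome)   # how much each neuron contributes
            grad[idx] += W.T @ syndrome
        if score.max() < tol:                      # all local constraints satisfied
            break
        j = int(np.argmax(score))                  # neuron blamed by the most clusters
        x[j] -= np.sign(grad[j])                   # unit correction toward satisfying them
    return x

In the paper's setting the clusters additionally overlap across parallel planes, so a cluster that cannot resolve its own errors receives help from neighboring clusters, in the spirit of spatially-coupled codes; the toy above only illustrates the local correction step.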

Authors (3)
  1. Amin Karbasi (116 papers)
  2. Amir Hesam Salavati (6 papers)
  3. Amin Shokrollahi (13 papers)
Citations (5)
