
Robust Associative Memories Naturally Occuring From Recurrent Hebbian Networks Under Noise (1709.08367v1)

Published 25 Sep 2017 in cs.NE

Abstract: The brain is a noisy system subject to energy constraints, yet these facts are rarely taken into account when modelling artificial neural networks. In this paper, we demonstrate that these factors can actually lead to the appearance of robust associative memories. We first propose a simplified model of noise in the brain that accounts for synaptic noise and interference from neurons external to the network. We show that, when coarsely quantized, this noise can be reduced to insertions and erasures. We then take a neural network with recurrent modifiable connections and subject it to noisy external inputs. We introduce an energy-usage limitation principle in the network, together with consolidated Hebbian learning, resulting in incremental processing of inputs. We show that the connections formed in this way correspond to state-of-the-art binary sparse associative memories.
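The abstract's core mechanism can be illustrated with a small sketch: sparse binary patterns stored by clipped (binary) Hebbian learning in a recurrent weight matrix, then retrieved from a cue corrupted by an erasure and an insertion. This is a minimal toy in the spirit of binary sparse associative memories (Willshaw/Gripon-style); the parameter choices (N, K, M), the global winner-take-all retrieval rule, and the single-erasure/single-insertion noise model are all illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the paper):
N, K, M = 64, 4, 10  # neurons, active neurons per pattern, stored patterns

# Generate M sparse binary patterns with K active neurons each.
patterns = np.zeros((M, N), dtype=int)
for p in patterns:
    p[rng.choice(N, size=K, replace=False)] = 1

# Consolidated Hebbian learning with binary (clipped) weights:
# a connection exists iff two neurons were co-active in some pattern.
W = np.zeros((N, N), dtype=int)
for p in patterns:
    W |= np.outer(p, p)
np.fill_diagonal(W, 0)

def retrieve(cue, steps=3):
    """Iteratively recover a sparse pattern from a noisy cue by keeping
    the K neurons with the highest recurrent support (winner-take-all)."""
    state = cue.copy()
    for _ in range(steps):
        scores = W @ state
        winners = np.argsort(scores)[-K:]
        state = np.zeros(N, dtype=int)
        state[winners] = 1
    return state

# Coarsely quantized noise as in the abstract: one erasure (an active
# neuron silenced) plus one insertion (a spurious neuron activated).
cue = patterns[0].copy()
cue[np.flatnonzero(cue)[0]] = 0                    # erasure
cue[rng.choice(np.flatnonzero(cue == 0))] = 1      # insertion

recovered = retrieve(cue)
```

The retrieval rule here is a simple global winner-take-all standing in for the paper's energy-usage limitation principle, which bounds how many neurons may fire at once; both enforce sparse activity during recall.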

Authors (4)
  1. Eliott Coyac (1 paper)
  2. Vincent Gripon (88 papers)
  3. Charlotte Langlais (4 papers)
  4. Claude Berrou (6 papers)
