
Unsupervised learning by a nonlinear network with Hebbian excitatory and anti-Hebbian inhibitory neurons (1812.11581v1)

Published 30 Dec 2018 in q-bio.NC and cs.NE

Abstract: This paper introduces a rate-based nonlinear neural network in which excitatory (E) neurons receive feedforward excitation from sensory (S) neurons, and inhibit each other through disynaptic pathways mediated by inhibitory (I) interneurons. Correlation-based plasticity of disynaptic inhibition serves to incompletely decorrelate E neuron activity, pushing the E neurons to learn distinct sensory features. The plasticity equations additionally contain "extra" terms fostering competition between excitatory synapses converging onto the same postsynaptic neuron and inhibitory synapses diverging from the same presynaptic neuron. The parameters of competition between S$\to$E connections can be adjusted to make learned features look more like "parts" or "wholes." The parameters of competition between I-E connections can be adjusted to set the typical decorrelatedness and sparsity of E neuron activity. Numerical simulations of unsupervised learning show that relatively few I neurons can be sufficient for achieving good decorrelation, and increasing the number of I neurons makes decorrelation more complete. Excitatory and inhibitory inputs to active E neurons are approximately balanced as a result of learning.
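The abstract describes a rate-based network in which S→E excitatory synapses learn through Hebbian plasticity while disynaptic inhibition routed through I interneurons is shaped by correlation-based (anti-Hebbian-style) plasticity that decorrelates E neuron activity. The sketch below illustrates that general architecture in NumPy. It is not the paper's model: the rate dynamics, update rules, decay terms, and all names and constants (W, Q, M, step, relu, learning rate, neuron counts) are assumptions made for illustration only.

```python
# Minimal illustrative sketch of a rate-based E/I network with Hebbian
# feedforward excitation and correlation-driven disynaptic inhibition.
# NOT the paper's plasticity equations; all rules and constants are assumed.
import numpy as np

rng = np.random.default_rng(0)
n_s, n_e, n_i = 64, 20, 5          # sensory, excitatory, inhibitory neuron counts
W = 0.1 * rng.random((n_e, n_s))   # S -> E feedforward excitatory weights
Q = 0.1 * rng.random((n_i, n_e))   # E -> I excitatory weights
M = 0.1 * rng.random((n_e, n_i))   # I -> E inhibitory weights (disynaptic pathway)

def relu(v):
    return np.maximum(v, 0.0)      # nonnegative firing rates

def step(x, eta=1e-3, n_iter=50):
    """Run E/I rate dynamics toward a fixed point, then apply plasticity."""
    global W, Q, M
    e = np.zeros(n_e)
    for _ in range(n_iter):                        # recurrent E/I dynamics
        i = relu(Q @ e)                            # I rates driven by E activity
        e = relu(W @ x - M @ i)                    # E rates: excitation minus inhibition
    # Hebbian update of S -> E weights, with a decay term for stability
    W += eta * (np.outer(e, x) - e[:, None] * W)
    # Correlation-based updates on the disynaptic inhibitory pathway:
    # inhibition strengthens where E and I activity are correlated,
    # which pushes E neurons toward decorrelated responses.
    Q += eta * (np.outer(i, e) - i[:, None] * Q)
    M += eta * np.outer(e, i)
    np.clip(M, 0.0, None, out=M)                   # keep inhibitory weights nonnegative
    return e, i

# Example usage: drive the network with random nonnegative sensory inputs
for _ in range(1000):
    x = rng.random(n_s)
    e_rates, i_rates = step(x)
```

The decay terms on W and Q stand in for the "extra" competitive terms mentioned in the abstract; the paper's actual competition between converging S→E synapses and diverging I→E synapses takes a different, parameterized form.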

Citations (4)
