
Correlation-invariant synaptic plasticity

Published 21 May 2021 in q-bio.NC (arXiv:2105.10109v2)

Abstract: Cortical populations of neurons develop sparse representations adapted to the statistics of the environment. While existing synaptic plasticity models reproduce some of the observed receptive-field properties, a major obstacle is the sensitivity of Hebbian learning to omnipresent spurious correlations in cortical networks, which can overshadow relevant latent input features. Here we develop a theory for synaptic plasticity that is invariant to second-order correlations in the input. Going beyond classical Hebbian learning, we show how Hebbian long-term depression (LTD) cancels the sensitivity to second-order correlations, so that receptive fields become aligned with features hidden in higher-order statistics. Our simulations demonstrate how correlation-invariance enables biologically realistic models to develop sparse population codes, despite diverse levels of variability and heterogeneity. The theory advances our understanding of local unsupervised learning in cortical circuits and assigns a specific functional role to synaptic LTD mechanisms in pyramidal neurons.
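
Illustrative sketch (not taken from the paper): the abstract's central mechanism is an LTD term that cancels the second-order, covariance-driven part of the Hebbian update, so that learning is steered by higher-order input statistics instead. The toy single-neuron simulation below assumes a cubic Hebbian nonlinearity paired with a linear LTD term, x(y^3 - 3<y^2>y); for zero-mean Gaussian input the expected value of this combination vanishes for any input covariance (Stein's lemma), so the rule is blind to second-order correlations. All specifics here (Laplacian latent feature, orthogonal mixing, learning rate, the exact form of the LTD term) are assumptions chosen for the demonstration and are not the model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic input: one sparse (Laplacian, heavy-tailed) latent feature mixed with
# higher-variance Gaussian directions, so the covariance (second-order statistics)
# points away from the sparse feature.
n_samples, n_dim = 50000, 3
sparse = rng.laplace(scale=1.0 / np.sqrt(2), size=n_samples)     # variance 1, kurtotic
gauss = rng.normal(scale=1.5, size=(n_samples, n_dim - 1))       # variance 2.25, Gaussian
sources = np.column_stack([sparse, gauss])

# Orthogonal mixing; column 0 is the input-space direction carrying the sparse feature.
mixing, _ = np.linalg.qr(rng.normal(size=(n_dim, n_dim)))
X = sources @ mixing.T
sparse_dir = mixing[:, 0]
top_pc = np.linalg.eigh(np.cov(X.T))[1][:, -1]                   # maximum-variance direction


def train(update, eta=0.05, epochs=400):
    """Batch weight updates of a single linear neuron y = w.x with unit-norm projection."""
    w = rng.normal(size=n_dim)
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        y = X @ w
        w = w + eta * update(y) / n_samples
        w /= np.linalg.norm(w)
    return w


# Classical Hebbian update <x y>: its mean is C w, i.e. purely second-order statistics.
hebbian = lambda y: X.T @ y

# Hebbian LTP plus a linear LTD term, <x (y^3 - 3<y^2> y)>: for zero-mean Gaussian input
# the LTD term cancels the LTP term exactly in expectation (Stein's lemma), so the update
# ignores second-order correlations and is driven by higher-order (kurtotic) structure.
ltd_corrected = lambda y: X.T @ (y**3 - 3.0 * np.mean(y**2) * y)

w_hebb = train(hebbian)
w_inv = train(ltd_corrected)

overlap = lambda a, b: abs(float(a @ b))
print(f"Hebbian       | top PC {overlap(w_hebb, top_pc):.2f} | sparse feature {overlap(w_hebb, sparse_dir):.2f}")
print(f"LTD-corrected | top PC {overlap(w_inv, top_pc):.2f} | sparse feature {overlap(w_inv, sparse_dir):.2f}")
```

On this synthetic input the classical Hebbian rule typically converges to the maximum-variance direction, while the LTD-corrected rule aligns with the sparse latent direction despite its lower variance, which is the qualitative effect the abstract describes.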
