Correlation-invariant synaptic plasticity
Abstract: Cortical populations of neurons develop sparse representations adapted to the statistics of the environment. While existing synaptic plasticity models reproduce some of the observed receptive-field properties, a major obstacle is the sensitivity of Hebbian learning to omnipresent spurious correlations in cortical networks, which can overshadow relevant latent input features. Here we develop a theory for synaptic plasticity that is invariant to second-order correlations in the input. Going beyond classical Hebbian learning, we show how Hebbian long-term depression (LTD) cancels the sensitivity to second-order correlations, so that receptive fields become aligned with features hidden in higher-order statistics. Our simulations demonstrate how correlation invariance enables biologically realistic models to develop sparse population codes, despite diverse levels of variability and heterogeneity. The theory advances our understanding of local unsupervised learning in cortical circuits and assigns a specific functional role to synaptic LTD mechanisms in pyramidal neurons.
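The abstract does not spell out the plasticity rule, but the cancellation it describes can be illustrated with a minimal sketch. One classical way to make a nonlinear Hebbian rule blind to second-order statistics follows from Stein's lemma: for zero-mean Gaussian input x with covariance C and output y = w·x, E[x f(y)] = C w E[f'(y)], so subtracting an LTD-like term proportional to C w leaves an update that vanishes on purely Gaussian (second-order) structure and responds only to higher-order features. The toy data, the cubic nonlinearity, and all parameters below are assumptions chosen for the demonstration, not the paper's simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy input: a sparse (heavy-tailed, higher-order) latent feature buried in
# low-rank correlated Gaussian background noise ("spurious correlations").
n, T = 20, 10_000
feature = rng.standard_normal(n)
feature /= np.linalg.norm(feature)            # hidden feature direction
latent = 1.5 * rng.laplace(size=T)            # sparse latent activation
background = 0.5 * rng.standard_normal((n, 3)) @ rng.standard_normal((3, T))
X = np.outer(feature, latent) + background
C = np.cov(X)                                 # second-order input statistics

f = lambda y: y**3                            # nonlinear Hebbian gain (assumed)
df = lambda y: 3 * y**2                       # its derivative, for the LTD term

def train(with_ltd, eta=1e-3, steps=2000):
    w = rng.standard_normal(n)
    w /= np.linalg.norm(w)
    for _ in range(steps):
        y = w @ X
        dw = X @ f(y) / T                     # Hebbian LTP term, E[x f(y)]
        if with_ltd:
            # LTD-like term C w E[f'(y)]; by Stein's lemma this cancels the
            # Gaussian (second-order) contribution to the Hebbian term.
            dw -= C @ w * np.mean(df(y))
        w += eta * dw
        w /= np.linalg.norm(w)                # synaptic weight normalisation
    return w

for with_ltd in (False, True):
    w = train(with_ltd)
    print(f"LTD={with_ltd}: |cos(w, feature)| = {abs(w @ feature):.3f}")
```

In this sketch the plain Hebbian rule is pulled toward the high-variance background directions, whereas the rule with the LTD term aligns with the sparse hidden feature, mirroring the abstract's claim that LTD cancels sensitivity to second-order correlations so receptive fields track higher-order statistics.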