
Hebbian Synaptic Plasticity and Bayesian Learning

Updated 1 October 2025
  • Hebbian synaptic plasticity is a mechanism by which synapses adjust their strength according to correlated activity; modern formulations integrate Bayesian inference, letting synaptic uncertainty shape the updates.
  • Adaptive learning rates arise from synaptic uncertainty, enabling rapid updates in low-activity scenarios and unifying classical Hebbian and delta-rule concepts.
  • Empirical and theoretical advances reveal that variability in post-synaptic potentials encodes uncertainty, linking error-driven signals and reinforcement feedback in neural adaptation.

Synaptic plasticity with Hebbian learning encompasses a set of activity-dependent mechanisms by which synapses adjust their efficacy, forming the basis for neural adaptation, memory, and information processing. The core Hebbian principle—often stated as "cells that fire together, wire together"—has been extended and formalized across diverse quantitative frameworks, ranging from classical correlation-based rules to Bayesian inference models. Recent advances integrate uncertainty estimation, competitive elimination, nonlinear correlations, and feedback from reward or global signals, providing a comprehensive and adaptive foundation for neural computation and learning.

1. Bayesian Hebbian Plasticity and Uncertainty Modulation

Modern formulations of synaptic plasticity interpret weight adaptation as an inference problem, where each synapse maintains a probability distribution over its target weight rather than only a point estimate. In the Bayesian synaptic plasticity framework (Aitchison et al., 2014), each synapse encodes both a mean $m$ (the log-target weight) and an uncertainty $s^2$ (posterior variance) through the posterior distribution

P(\theta_{\text{tar}} \mid D) \approx \mathcal{N}(m, s^2)

where $\theta_{\text{tar}}$ is the synapse's ideal log-weight, inferred from data $D$.

The update to the mean is given by

\Delta m_i \propto \lambda_i x_i f

where $x_i$ is presynaptic activity, $f$ is an error or postsynaptic signal, and the learning rate $\lambda_i \propto s_i^2$ adapts to each synapse's local uncertainty. Higher uncertainty (i.e., larger $s_i^2$) results in larger weight updates, making the learning rate itself a function of the synapse's confidence.

Bayesian inference for plasticity proceeds in two steps: incorporating new data via Bayes’ rule, and then accounting for drift in the underlying “true” weight. Approximations such as Assumed Density Filtering are used to provide closed-form update rules.
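
A minimal sketch of this two-step procedure for a single synapse is given below in Python. The Gaussian evidence step with a Kalman-style gain, the variance-shrinkage rule, and the constants `s2_prior` and `drift` are illustrative assumptions made for this sketch, not the exact Assumed Density Filtering equations of the paper.

```python
def bayesian_synapse_update(m, s2, x, f, s2_prior=1.0, drift=1e-3):
    """One illustrative two-step update of a synapse's posterior N(m, s2).

    Step 1 incorporates new evidence (presynaptic activity x, feedback f)
    with a gain that grows with the current uncertainty s2; step 2 accounts
    for slow drift of the underlying target weight by re-inflating s2.
    """
    if x != 0.0:
        gain = s2 / (s2 + s2_prior)   # uncertainty-dependent learning rate
        m = m + gain * x * f          # Delta m is proportional to s2 * x * f
        s2 = (1.0 - gain) * s2        # incorporating evidence shrinks the variance
    s2 = s2 + drift                   # drift of the true weight grows it back
    return m, s2

# Usage: a few updates with intermittent presynaptic activity.
m, s2 = 0.0, 1.0
for x, f in [(1.0, 0.8), (0.0, 0.1), (1.0, -0.3)]:
    m, s2 = bayesian_synapse_update(m, s2, x, f)
print(m, s2)
```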

The Bayesian view generalizes classical Hebbian and delta-rule learning, recovering them as limiting cases when uncertainty is negligible and fixed, and formalizes the interpretation that synaptic learning rates should be high when information is sparse or ambiguous.

2. Adaptive Learning Rates and Plasticity–Uncertainty Feedback

The adaptive learning rate from the Bayesian framework leads to several experimentally falsifiable predictions. One is an inverse relationship between presynaptic firing rate and synaptic uncertainty:

  • Synapses receiving lower presynaptic firing rates (i.e., lower $\nu$) retain larger uncertainties ($s^2$) and adjust their weights more per update event. Specifically, relative weight changes scale as $\Delta w / w \propto 1/\nu$.
  • The normalized variability of postsynaptic potentials (PSPs), quantified as $\mathrm{PSP}_{\mathrm{variance}}/\mathrm{PSP}_{\mathrm{mean}}$, is predicted to decrease with increasing presynaptic firing rate.

According to the "Synaptic Sampling" hypothesis (Aitchison et al., 2014), a synapse communicates its uncertainty via the trial-to-trial variability in PSPs: If PSPN(mi,si2)PSP \sim \mathcal{N}(m_i, s_i^2), large si2s_i^2 signals high uncertainty, modulating both learning rate and neural network stability.

In summary, the core update rules for mean and variance take the form

m_i^{\mathrm{new}} = m_i^{\mathrm{old}} + \frac{s_i^2}{s_i^2 + s_{\text{prior}}^2}\, x_i f + \text{drift}

\Delta s_i^2 = \text{update decrement} + \text{drift increment}

thereby tightly coupling plasticity intensity to ongoing uncertainty estimates.
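
To make this coupling concrete, the toy simulation below applies the same uncertainty-gated update to two synapses, one driven frequently and one rarely. The firing probabilities, the drift constant, and the Gaussian stand-in for the feedback signal $f$ are invented for demonstration; the qualitative outcome is that the rarely driven synapse retains a larger average $s^2$ and makes larger per-event changes, in line with $\Delta w / w \propto 1/\nu$.

```python
import numpy as np

rng = np.random.default_rng(0)
s2_prior, drift, T = 1.0, 1e-3, 5000

for rate in (0.5, 0.05):                   # probability of a presynaptic event per step
    m, s2 = 0.0, 1.0
    per_event_changes, variances = [], []
    for _ in range(T):
        x = float(rng.random() < rate)     # presynaptic activity on this step
        f = rng.normal()                   # stand-in error / feedback signal
        if x:
            gain = s2 / (s2 + s2_prior)    # uncertainty-dependent learning rate
            dm = gain * x * f
            per_event_changes.append(abs(dm))
            m += dm
            s2 = (1.0 - gain) * s2         # evidence shrinks the variance
        s2 += drift                        # drift grows uncertainty between events
        variances.append(s2)
    print(f"rate={rate:.2f}  mean s^2={np.mean(variances):.3f}  "
          f"mean |dm| per event={np.mean(per_event_changes):.3f}")
```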

3. Generalization of Hebbian and Delta-rule Learning

Bayesian plasticity incorporates feedback signals of arbitrary type—linear, cerebellar, or reinforcement-based—into unified update rules. When uncertainty is small and relatively constant, the Bayesian update reduces to the standard delta rule

\Delta w_i = \eta\, x_i f

for a fixed learning rate $\eta$. The Bayesian framework, however, justifies and generalizes this rule, showing that for biological synapses facing sparse, noisy data, using a learning rate that scales with perceived uncertainty is optimal both for rapid learning and stability.
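
Written out in the notation used above (with $\sigma^2$ standing in for the assumed, approximately constant value at which the posterior variance settles), the correspondence is:

\Delta m_i = \frac{s_i^2}{s_i^2 + s_{\text{prior}}^2}\, x_i f \;\longrightarrow\; \Delta w_i = \eta\, x_i f, \qquad \eta = \frac{\sigma^2}{\sigma^2 + s_{\text{prior}}^2} \;\; \text{(fixed)}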

This principle extends to network motifs that rely on parallel feedback (e.g., cerebellar architectures) or linear error signals, establishing a normative theoretical basis for diverse synaptic learning rule implementations.

4. Variability in Post-synaptic Potential Size and Uncertainty Coding

The "Synaptic Sampling" proposal asserts that the size and variability of the PSPs are not merely noise but encode the synapse's uncertainty estimate. In practice: PSPmean=mi,PSPvariance=si2\mathrm{PSP}_\mathrm{mean} = m_i, \quad \mathrm{PSP}_\mathrm{variance} = s_i^2 Empirical analyses of in vivo synaptic recordings (e.g., cortical data) confirm that the normalized variability in PSPs decreases with rising presynaptic firing rate, as anticipated by the Bayesian framework. A key prediction is that observing high PSP variability in a synapse is evidence of its uncertainty regarding its optimal weight, thus functionally linking plasticity, variability, and learning rate.

5. Experimental and Theoretical Implications

The Bayesian synaptic plasticity theory yields a suite of testable predictions and explanatory advances:

  • Synapses with infrequent presynaptic input should display both higher learning rates and larger PSP variability.
  • Sampling-based communication of uncertainty provides a mechanistic explanation for the large variability in neurotransmitter release and PSP size observed in cortical and subcortical systems.
  • Bayesian update rules account for differences observed across diverse learning feedback regimes, suggesting that chronic recordings could reveal time-evolving “confidence” traces at synapses.

This perspective rationalizes the adaptability of biological synapses, where what superficially appears as synaptic “noise” is instead an essential variable for efficient learning.

6. Synthesis: Toward a Normative Theory of Synaptic Plasticity

By framing synaptic plasticity as a Bayesian inference problem, classical Hebbian learning is recast as a process where synapses estimate not only their strength but also the uncertainty in that estimate, using this uncertainty to modulate the pace and magnitude of learning. The framework predicts that synapses adaptively regulate their update rates in response to the local statistics of their inputs and past activity, accounting for both trial-to-trial variability and long-term learning stability.

This synthesis generalizes and connects previously distinct strands of learning theory, integrating error-driven, Hebbian, and reinforcement learning signals in a single probabilistically justified adaptive rule. This normative approach challenges and deepens traditional interpretations of synaptic plasticity, addressing how brain circuits can achieve both robustness and rapid adaptation when learning from ambiguous or noisy sensory inputs.

In summary, synaptic plasticity with Hebbian learning evolves beyond fixed-rate correlation rules toward adaptive, uncertainty-sensitive inference mechanisms. Such frameworks anticipate, explain, and unify a wide array of phenomenology—adaptive learning rates, variability, noise-coding, and feedback-response matching—that are critical in cortical and subcortical computation (Aitchison et al., 2014).

References

  • Aitchison et al. (2014).
