Variable Spiking Neurons

Updated 11 November 2025
  • Variable Spiking Neurons (VSNs) are hybrid neural models that combine discrete spiking events with continuous, differentiable outputs for precise regression and classification.
  • They employ leaky integration and threshold-triggered gating mechanisms to achieve event-driven sparsity while maintaining rich signal processing.
  • Empirical benchmarks demonstrate that VSNs outperform traditional spiking neurons in accuracy and energy efficiency across applications like image classification and time-series regression.

Variable Spiking Neurons (VSNs) are a class of hybrid neural models and neuromorphic devices that combine the sparse, event-driven dynamics characteristic of biological spiking neurons with the information-rich and differentiable outputs of artificial neuron activations. Emerging from both theoretical modeling and device-level engineering, VSNs address the trade-off between energy efficiency and computational expressivity, enabling precise regression as well as classification, and providing a framework for both mathematical analysis and hardware implementation. They are characterized by mechanisms that produce spike events (discrete emissions when a membrane-like potential crosses a threshold) whose amplitude or information content is variable rather than strictly binary, and whose dynamics may depend on history, device physics, or explicit memory of variable length.

1. Hybrid Computational Model: Definition and Mathematical Structure

VSNs incorporate the intermittent firing of leaky integrate-and-fire (LIF) spiking neurons while introducing graded, continuous-valued outputs via a gating mechanism. Instead of binary 0/1 spikes, VSNs emit a continuous response conditioned on spiking events, which enables compatibility with both event-driven architectures and gradient-based optimizers. The generic VSN formulation may be expressed as follows for a discrete spike time-step $t$ (Garg et al., 2023, Jain et al., 12 Dec 2024):

  • Membrane integration with leak:

$$M^{(t)} = \beta M^{(t-1)} + z^{(t)}$$

where $M^{(t)}$ is the membrane potential, $\beta \in [0,1]$ is the leakage factor, and $z^{(t)}$ is the weighted input sum.

  • Spike generation with reset:

$$\tilde y^{(t)} = \begin{cases} 1 & \text{if } M^{(t)} \geq T \\ 0 & \text{otherwise} \end{cases} \qquad \text{and if } \tilde y^{(t)} = 1,\ M^{(t)} \gets 0$$

where $T$ is the firing threshold.

  • Continuous activation gated by spike:

$$y^{(t)} = \sigma\left(z^{(t)} \cdot \tilde y^{(t)}\right), \qquad \sigma(0) = 0$$

where $\sigma(\cdot)$ is a smooth activation function (e.g., ReLU, GELU).

This architecture enables event-driven sparsity (outputs only when threshold is crossed) while preserving the rich signal processing and regression capacity of continuous-valued artificial neurons.

| Variable | Description |
|---|---|
| $z^{(t)}$ | Input to the neuron at spike step $t$ |
| $M^{(t)}$ | Membrane potential at time $t$ |
| $\tilde y^{(t)}$ | Binary spike indicator |
| $y^{(t)}$ | Gated continuous output at $t$ |
| $\beta$ | Leakage parameter ($0 \leq \beta \leq 1$) |
| $T$ | Spike threshold |
| $\sigma(\cdot)$ | Continuous activation with $\sigma(0) = 0$ |

In contrast to pure LIF units, which produce only binary 0/1 spikes, the VSN transmits information proportional to the instantaneous input, conditioned on a thresholded firing event (Garg et al., 2023, Jain et al., 12 Dec 2024).
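
The update rule above is compact enough to state directly in code. Below is a minimal NumPy sketch of one VSN time step under these equations; the ReLU choice for $\sigma$, the layer width, and the toy inputs are illustrative assumptions rather than details from the cited papers.

```python
import numpy as np

def vsn_step(z, M, beta=0.9, T=1.0, sigma=lambda x: np.maximum(x, 0.0)):
    """One VSN time step for a vector of neurons.

    z     : weighted input sum z^(t)
    M     : membrane potential carried over from the previous step
    beta  : leakage factor in [0, 1]
    T     : firing threshold
    sigma : continuous activation with sigma(0) = 0 (ReLU here)
    """
    M = beta * M + z                     # leaky membrane integration
    spike = (M >= T).astype(z.dtype)     # binary spike indicator y~^(t)
    M = np.where(spike > 0, 0.0, M)      # hard reset on firing
    y = sigma(z * spike)                 # graded output, gated by the spike
    return y, M

# Toy usage: four neurons driven for three steps.
M = np.zeros(4)
for z in (np.array([0.2, 0.8, 1.5, -0.3]),
          np.array([0.9, 0.5, 0.1, 0.4]),
          np.array([0.3, 0.2, 2.0, 1.1])):
    y, M = vsn_step(z, M)
    print(y)
```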

2. Memory and Stochastic Processes with Variable Length

A variant of VSNs emerges in the literature as stochastic processes with explicit variable-memory dependency (Galves et al., 2015). In these models, each neuron's spike probability or instantaneous firing rate at time $t$ is determined by the accumulated, weighted, and possibly decayed synaptic input received since its last spike, not a fixed-length window. Specifically:

  • For discrete time: The probability of neuron $i$ spiking at time $t$ is

$$P\big(X_t(i) = 1 \mid \mathcal{F}_{t-1}\big) = \phi_i\left( \sum_{j \in I} W_{j \to i} \sum_{s = L^i_t}^{t-1} g_j(t-s)\, X_s(j) \right)$$

where $L^i_t$ is the last time $s < t$ at which neuron $i$ spiked, $g_j(\cdot)$ is the leak kernel, $W_{j \to i}$ are synaptic weights, and $\phi_i$ is a transfer function.

  • For continuous time: The intensity of spikes is

$$\lambda^i_t = M_i \cdot \phi_i\left( \sum_{j \in I} W_{j \to i} \int_{[L^i_t, t)} g_j(t-s)\, dZ^j_s \right)$$

Here, the memory window $(L^i_t, t)$ varies stochastically, yielding a non-Markovian process. This structure enables the reproduction of realistic interspike interval statistics and the “reset-and-forget” dynamics essential for biological plausibility (Galves et al., 2015).
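
A short simulation makes the variable-memory structure concrete. The sketch below implements the discrete-time model with an exponential leak kernel $g(\Delta) = e^{-\lambda \Delta}$ shared across neurons, a shifted-sigmoid transfer function, and random Gaussian weights; these choices and all parameter values are assumptions for illustration, not the specification in Galves et al. (2015).

```python
import numpy as np

rng = np.random.default_rng(0)
N, T_steps = 5, 200
W = rng.normal(0.0, 1.0, (N, N)) / np.sqrt(N)     # synaptic weights W[j, i] = W_{j -> i}
lam = 0.5                                         # decay rate of the leak kernel
phi = lambda u: 1.0 / (1.0 + np.exp(-(u - 2.0)))  # transfer function (shift lowers baseline rate)

X = np.zeros((T_steps, N), dtype=int)             # spike history X_s(j)
last = np.zeros(N, dtype=int)                     # L^i_t: time of neuron i's last spike

for t in range(1, T_steps):
    p = np.empty(N)
    for i in range(N):
        s = np.arange(last[i], t)                 # variable-length memory window [L^i_t, t)
        g = np.exp(-lam * (t - s))                # leak kernel g(t - s)
        drive = g @ X[s]                          # accumulated input per presynaptic neuron j
        p[i] = phi(W[:, i] @ drive)               # spike probability of neuron i
    X[t] = rng.random(N) < p
    last[X[t] == 1] = t                           # reset: inputs before the spike are forgotten

print("mean firing rate per neuron:", X.mean(axis=0))
```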

3. Device-Level Realization in Neuromorphic Hardware

Variable spiking has been realized in a range of physical devices, notably spintronic and electromechanical elements. Examples include:

  • Spintronic Variable Spiking Devices: Dual-free-layer perpendicular magnetic tunnel junctions (MTJs) are engineered to generate voltage-controlled spike trains. Firing is triggered when the magnetization configuration cycles between parallel and antiparallel states under DC bias. The output frequency and energy per spike are tunable via the applied voltage. Notable features include field-free operation, a sub-100 nm footprint, energy consumption of 4–16 pJ per spike, and device-level non-volatility (Farcis et al., 2023).
  • Leaky Memcapacitive VSN Device: Electromechanical neurons based on a movable-plate leaky memcapacitor combine a state-dependent capacitance and resistance controlled by nanoscale mechanical displacement. The integration of memcapacitive (energy storage) and memristive (leaky) branches enables rich spiking dynamics, including tonic, phasic, and bursting regimes, which are analytically tracked via phase-plane bifurcations. Spike frequency, amplitude, and variability are tuned by device geometry and drive parameters (Zhang et al., 2023).

Such devices allow for analog control of spike shape, frequency, and event amplitude, tightly integrating computing and memory.

4. Training and Surrogate Gradient Methods

VSNs in deep learning networks are generally trained with backpropagation and surrogate gradients to handle the non-differentiable spike thresholding. The derivative of the step function $\tilde y^{(t)}$ with respect to the membrane potential $M$ is replaced by a smooth fast-sigmoid surrogate:

$$s'(M) = \frac{1}{\left[ 1 + \exp\big(-\alpha (M - T)\big) \right]^2}$$

where $\alpha$ is typically set to 25 for sharpness (Garg et al., 2023).
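
In code, this is typically realized as a straight-through operation: the hard threshold is applied in the forward pass and the smooth surrogate in the backward pass. Below is a minimal PyTorch sketch of that pattern using the surrogate derivative above; the class name and usage are illustrative, not taken from the cited implementations.

```python
import torch

class SpikeSurrogate(torch.autograd.Function):
    """Heaviside spike forward; smooth fast-sigmoid surrogate gradient backward."""
    alpha = 25.0  # sharpness, per the value quoted above

    @staticmethod
    def forward(ctx, M, T):
        ctx.save_for_backward(M)
        ctx.T = T
        return (M >= T).float()

    @staticmethod
    def backward(ctx, grad_out):
        (M,) = ctx.saved_tensors
        # d(spike)/dM replaced by 1 / [1 + exp(-alpha (M - T))]^2
        surr = 1.0 / (1.0 + torch.exp(-SpikeSurrogate.alpha * (M - ctx.T))) ** 2
        return grad_out * surr, None  # no gradient for the threshold T

# Spikes flow forward as hard 0/1 events; gradients pass through the surrogate.
M = torch.randn(8, requires_grad=True)
spikes = SpikeSurrogate.apply(M, 1.0)
spikes.sum().backward()
print(M.grad)
```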

  • Loss Functions: Cross-entropy for classification, mean squared error (MSE) or normalized MSE for regression.
  • Optimizers and Hyperparameters: Adam optimizer with weight decay $1 \times 10^{-4}$; learning rates of $10^{-3}$ for AN/VSN and $10^{-4}$ for SNN; batch sizes 200–1000; epochs 500–1000; spike time steps (STS) of 1 for VSN (continuous precision) versus 50–100 for SNN (binary encoding) (Garg et al., 2023, Jain et al., 12 Dec 2024). A representative setup is sketched below.
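
Assembled into a training loop, those hyperparameters might look as follows for the VSN regression case; the two-layer placeholder network and random data stand in for an actual VSN model, which is not shown here.

```python
import torch
import torch.nn as nn

# Placeholder network standing in for a VSN model (illustrative only).
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))

# Settings mirroring the hyperparameters listed above.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
loss_fn = nn.MSELoss()  # cross-entropy would replace this for classification

x, y = torch.randn(200, 16), torch.randn(200, 1)  # one batch of size 200
for epoch in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```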

VSNs are often interleaved with standard message-passing operations in hybrid graph neural network architectures (HVS-GNN), where VSNs replace standard nonlinearities to promote event-driven sparsity while retaining regression performance (Jain et al., 12 Dec 2024).

5. Empirical Performance and Energy Efficiency

Benchmarks illustrate VSNs' efficacy in both supervised learning and energy-sensitive contexts:

  • Classification (MNIST): VSN-1 achieves 98.05% ± 0.12 accuracy with average spike rates of 12–23% in hidden layers, matching artificial neurons (98.03% ± 0.11) while reducing synaptic energy to 81% of dense activations (Garg et al., 2023).
  • Regression (Feynman datasets): VSNs with linear activation outperform both pure ANNs and LIF-SNNs in solving high-precision regression, at 10–39% average spike rates and 11–19% of ANN synaptic energy.
  • Graph Regression Tasks: In materials simulation and stress prediction, HVS-GNNs with VSNs achieve mean squared errors within 3–10% of standard GNNs, with event sparsity of 18–59%, corresponding to up to 82% reduction in synaptic operations (Jain et al., 12 Dec 2024).

Binary LIF spiking neurons configured for similar sparsity levels consistently underperform VSNs on regression accuracy, underlining the utility of continuous, variable-amplitude spikes.
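
The energy figures above follow the common event-driven accounting in which synaptic energy scales with the number of synaptic operations actually triggered. A back-of-the-envelope version of that accounting is sketched below; the layer fan-outs and the fully active input layer are assumptions, and the output is illustrative rather than a reproduction of the cited figures.

```python
def relative_synaptic_energy(activity_rates, layer_fanouts):
    """Synaptic energy of an event-driven net relative to a dense ANN of the same shape.

    activity_rates : fraction of units active per step in each layer feeding the next
    layer_fanouts  : synaptic operations from that layer into the next one
    """
    dense = sum(layer_fanouts)
    sparse = sum(r * f for r, f in zip(activity_rates, layer_fanouts))
    return sparse / dense

# MNIST-like example with the 12-23% hidden spike rates quoted above
# (784-256-256-10 topology assumed; input layer treated as fully active).
print(relative_synaptic_energy([1.0, 0.12, 0.23],
                               [784 * 256, 256 * 256, 256 * 10]))
```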

6. Theoretical Properties, Limit Dynamics, and Mathematical Foundations

VSNs modeled as interacting processes with memory of variable length admit explicit mathematical analysis:

  • Stationary Solutions and Perfect Simulation: Under boundedness and Lipschitz assumptions on weights and activation functions, VSN networks admit unique stationary solutions, with polynomial loss of memory quantified as $O(1/s)$ decay.
  • Decorrelation: Successive interspike intervals rapidly decorrelate as network size increases, consistent with neuronal recordings. For appropriately random connectivity, the covariance between consecutive ISIs vanishes as $N \to \infty$.
  • Mean-Field Propagation of Chaos: With appropriate scaling, the empirical law of membrane potentials converges to a deterministic trajectory governed by a nonlocal PDE. Individual neurons become asymptotically independent, each solving a nonlinear McKean–Vlasov equation (Galves et al., 2015).
  • Invariant Measures: Explicit forms for mixed populations, extinction thresholds, and phase transitions (e.g., between quiescent and active states under varying leak or excitation parameters) are available.

The non-Markovian structure allows for statistical properties and dynamic transitions not possible in fixed-memory LIF or Hawkes processes and admits new avenues for perfect simulation algorithms and analysis of real biological spike data.
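
To make the mean-field statement concrete: in the large-$N$ regime described above, a representative neuron's potential $U_t$ is expected to evolve deterministically between its own spikes, reset to zero when it fires, and feel the population only through the law of $U_t$. Schematically (the notation here is illustrative; the precise formulation and assumptions are those of Galves et al., 2015 and the related mean-field literature):

$$\frac{d}{dt} U_t = -\alpha U_t + W\, \mathbb{E}\big[\varphi(U_t)\big] \ \text{ between spikes}, \qquad U_t \gets 0 \ \text{ at rate } \varphi(U_t)$$

Because both the drift and the firing rate depend on the distribution of $U_t$ through $\mathbb{E}[\varphi(U_t)]$, this is a McKean–Vlasov-type equation rather than a standard SDE with fixed coefficients.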

7. Application Domains and Implementation Considerations

VSNs are suited to domains requiring precise, energy-efficient, event-driven computation with support for regression:

  • Scientific and Mathematical Modeling: Physics-informed neural operators, computational mechanics, and time-series regression tasks.
  • Edge-AI & Embedded Sensors: Robotics, wearable medical devices, and Internet-of-Things nodes where the energy and communication budget is critical (Garg et al., 2023, Jain et al., 12 Dec 2024).
  • Neuromorphic Hardware: Spintronic VSNs, leaky memcapacitive elements, and CMOS-compatible devices enable dense integration and low-power operation (Farcis et al., 2023, Zhang et al., 2023).
  • Statistical Simulation: Non-Markovian spike-train simulation, inference of synaptic connectivity graphs, and studies of population synchrony (Galves et al., 2015).

VSNs furnish a flexible framework, combining rich dynamical repertoire (graded, phasic, and bursting activity), mathematical tractability, and compatibility with both classical ANNs and emerging neuromorphic devices. A plausible implication is that further development in adaptive thresholds and device-level tunability will extend their capability for online learning and real-time edge inference.
