SIT: A Bionic and Non-Linear Neuron for Spiking Neural Network (2203.16117v2)

Published 30 Mar 2022 in cs.CV and cs.NE

Abstract: Spiking Neural Networks (SNNs) have piqued researchers' interest because of their capacity to process temporal information and their low power consumption. However, current state-of-the-art methods limit their biological plausibility and performance because their neurons are generally built on the simple Leaky-Integrate-and-Fire (LIF) model. Owing to their high dynamic complexity, modern neuron models have seldom been implemented in SNN practice. In this study, we adopt the Phase Plane Analysis (PPA) technique, a technique often utilized in the field of neurodynamics, to integrate a recent neuron model, namely, the Izhikevich neuron. According to findings in neuroscience, the Izhikevich neuron model can be biologically plausible while maintaining a computational cost comparable to that of LIF neurons. By utilizing the adopted PPA, we put neurons built with the modified Izhikevich model into SNN practice, dubbed the Standardized Izhikevich Tonic (SIT) neuron. For performance, we evaluate the suggested technique for image classification tasks in self-built SNNs composed of LIF and SIT neurons, named Hybrid Neural Networks (HNNs), on the static MNIST, Fashion-MNIST, and CIFAR-10 datasets and the neuromorphic N-MNIST, CIFAR10-DVS, and DVS128 Gesture datasets. The experimental results indicate that the suggested method achieves comparable accuracy while exhibiting more biologically realistic behaviors on nearly all test datasets, demonstrating the efficiency of this novel strategy in bridging the gap between neurodynamics and SNN practice.
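The abstract builds on the standard Izhikevich neuron model (Izhikevich, 2003). As a hedged illustration of the base dynamics the SIT neuron starts from, the following is a minimal Euler-integration sketch of that model in its tonic-spiking regime; the function name, parameters, and time step are illustrative choices, and the paper's PPA-based standardization of the model is not reproduced here.

```python
# Minimal sketch of the standard Izhikevich neuron (tonic-spiking
# parameters a=0.02, b=0.2, c=-65, d=6 from Izhikevich, 2003).
# This is NOT the paper's SIT neuron, only the base model it modifies.

def simulate_izhikevich(I=10.0, T=200.0, dt=0.5,
                        a=0.02, b=0.2, c=-65.0, d=6.0):
    """Return spike times (ms) for a constant input current I."""
    v = -70.0            # membrane potential (mV)
    u = b * v            # membrane recovery variable
    spikes = []
    for step in range(int(T / dt)):
        # Izhikevich dynamics: v' = 0.04 v^2 + 5 v + 140 - u + I
        #                      u' = a (b v - u)
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:    # spike: record time, then reset v and bump u
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

spike_times = simulate_izhikevich()
print(len(spike_times))  # regular (tonic) spiking under constant drive
```

With a constant suprathreshold current, this regime fires at a regular rate, which is the "tonic" behavior the SIT neuron standardizes for use in SNN layers.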

Authors (4)
  1. Cheng Jin (76 papers)
  2. Rui-Jie Zhu (20 papers)
  3. Xiao Wu (55 papers)
  4. Liang-jian Deng (32 papers)
Citations (4)