SynA-ResNet: Spike-driven ResNet Achieved through OR Residual Connection (2311.06570v3)

Published 11 Nov 2023 in cs.CV

Abstract: Spiking Neural Networks (SNNs) have garnered substantial attention in brain-like computing for their biological fidelity and their capacity to execute energy-efficient, spike-driven operations. As the demand for higher performance in SNNs grows, training deeper networks becomes imperative, and residual learning is the pivotal method for doing so. In our investigation, we identified that SEW-ResNet, a prominent representative of deep residual spiking neural networks, incorporates non-event-driven operations. To rectify this, we propose a novel training paradigm that first accumulates a large amount of redundant information through the OR Residual Connection (ORRC), and then filters out that redundancy using the Synergistic Attention (SynA) module, which promotes feature extraction in the backbone while suppressing the influence of noise and useless features in the shortcuts. When integrating SynA into the network, we observed a phenomenon of "natural pruning": after training, some or all of the network's shortcuts naturally drop out without affecting classification accuracy. This significantly reduces computational overhead and makes the model more suitable for deployment on edge devices. Experimental results on various public datasets confirm that SynA-ResNet achieves single-sample classification with as few as 0.8 spikes per neuron. Moreover, compared to other residual SNN models, it exhibits higher accuracy and up to a 28-fold reduction in energy consumption.
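
For intuition, the sketch below illustrates the OR residual idea in PyTorch. The class name ORResidualBlock, the Heaviside stand-in neuron, and the layer shapes are assumptions for illustration, not the paper's implementation. The point it demonstrates is that combining two binary spike tensors with a logical OR keeps the block's output binary, and hence spike-driven, whereas the ADD used in SEW-ResNet can produce non-spike values such as 2.

```python
import torch
import torch.nn as nn


class Heaviside(nn.Module):
    """Crude stand-in spiking neuron: threshold at zero.

    Any neuron that emits binary {0, 1} spike tensors (e.g., a LIF
    neuron with a surrogate gradient) could be used here instead.
    """

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        return (t > 0).to(t.dtype)


class ORResidualBlock(nn.Module):
    """Illustrative OR Residual Connection (ORRC) sketch.

    Names, shapes, and the conv/BN layout are assumptions for
    illustration; they are not the paper's code.
    """

    def __init__(self, channels: int, neuron: nn.Module):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3,
                              padding=1, bias=False)
        self.bn = nn.BatchNorm2d(channels)
        self.neuron = neuron  # emits binary spikes

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Backbone path: conv -> BN -> spiking neuron (binary output).
        s = self.neuron(self.bn(self.conv(x)))
        # OR residual: a logical OR of two binary spike tensors stays
        # binary, so the block remains event-driven. An ADD here (as
        # in SEW-ResNet) could yield values in {0, 1, 2}.
        return torch.logical_or(s.bool(), x.bool()).to(x.dtype)


if __name__ == "__main__":
    block = ORResidualBlock(4, neuron=Heaviside())
    spikes_in = (torch.rand(1, 4, 8, 8) > 0.5).float()
    out = block(spikes_in)
    assert set(out.unique().tolist()) <= {0.0, 1.0}  # output stays binary
```

In the paper's pipeline, the SynA attention module is then trained to filter the redundant spikes that the OR shortcut accumulates; the reported "natural pruning" effect means some or all shortcuts can be dropped entirely after training.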

Authors (6)
  1. Yimeng Shan (8 papers)
  2. Xuerui Qiu (16 papers)
  3. Haicheng Qu (5 papers)
  4. Rui-Jie Zhu (20 papers)
  5. Jason K. Eshraghian (33 papers)
  6. Malu Zhang (43 papers)
Citations (2)
