Towards Lossless ANN-SNN Conversion under Ultra-Low Latency with Dual-Phase Optimization (2205.07473v3)

Published 16 May 2022 in cs.NE and cs.LG

Abstract: Spiking neural networks (SNNs) operating with asynchronous discrete events show higher energy efficiency with sparse computation. A popular approach for implementing deep SNNs is ANN-SNN conversion, combining the efficient training of ANNs with the efficient inference of SNNs. However, the accuracy loss is usually non-negligible, especially under a small number of time steps, which greatly restricts the application of SNNs to latency-sensitive edge devices. In this paper, we first identify that such performance degradation stems from the misrepresentation of the negative or overflow residual membrane potential in SNNs. Inspired by this, we decompose the conversion error into three parts: quantization error, clipping error, and residual membrane potential representation error. With these insights, we propose a two-stage conversion algorithm to minimize those errors respectively. Besides, we show that each stage achieves significant performance gains in a complementary manner. By evaluating on challenging datasets including CIFAR-10, CIFAR-100, and ImageNet, the proposed method demonstrates state-of-the-art performance in terms of accuracy, latency, and energy preservation. Furthermore, our method is evaluated on a more challenging object detection task, revealing notable gains in regression performance under ultra-low latency compared to existing spike-based detection algorithms. Code is available at https://github.com/Windere/snn-cvt-dual-phase.
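
To make the error decomposition named in the abstract concrete, below is a minimal, self-contained sketch (not the authors' released code; see their repository for the actual method). It contrasts a clipped-and-quantized ReLU, a common ANN-side surrogate in conversion work, with a reset-by-subtraction integrate-and-fire neuron driven for T time steps. The function names `clip_floor` and `if_neuron_rate` and the constant-input assumption are illustrative choices, not from the paper.

```python
import numpy as np

# Hypothetical illustration of the three conversion-error sources the abstract names
# (quantization, clipping, residual membrane potential). Not the authors' implementation.

def clip_floor(x, T, v_th):
    """ANN-side surrogate activation: clipping (the min/max bound) plus
    quantization (the floor) of a ReLU output onto T discrete levels."""
    return v_th / T * np.clip(np.floor(x * T / v_th), 0, T)

def if_neuron_rate(x, T, v_th):
    """SNN-side: integrate-and-fire neuron with reset-by-subtraction, driven by a
    constant input x for T steps. Whatever potential is left after T steps
    (negative or overflowing) is the residual-potential representation error."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += x               # integrate the input current
        if v >= v_th:        # fire at most once per step, then subtract the threshold
            spikes += 1
            v -= v_th
    return v_th * spikes / T, v  # rate-coded output and residual potential

if __name__ == "__main__":
    T, v_th = 4, 1.0
    for x in (0.3, 0.9, 1.7, -0.2):
        ann = clip_floor(x, T, v_th)
        rate, residual = if_neuron_rate(x, T, v_th)
        print(f"x={x:+.1f}  ANN surrogate={ann:.2f}  SNN rate={rate:.2f}  residual v={residual:+.2f}")
```

Running the sketch, the negative input leaves a negative residual potential and the large input leaves an overflow residual, the two misrepresentation cases the paper identifies; in a deep network such per-layer mismatches accumulate, which is what the proposed dual-phase optimization aims to suppress at low T.
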

Authors (6)
  1. Ziming Wang (59 papers)
  2. Shuang Lian (1 paper)
  3. Yuhao Zhang (107 papers)
  4. Xiaoxin Cui (4 papers)
  5. Rui Yan (250 papers)
  6. Huajin Tang (44 papers)
Citations (20)
