Optimal ANN-SNN Conversion with Group Neurons
Abstract: Spiking Neural Networks (SNNs) have emerged as a promising third generation of neural networks, offering unique characteristics such as binary outputs, high sparsity, and biological plausibility. However, the lack of effective learning algorithms remains a challenge for SNNs. For instance, while converting artificial neural networks (ANNs) to SNNs circumvents the need for direct training of SNNs, it suffers from conversion errors and long inference latency. To reduce or even eliminate conversion errors while decreasing the number of inference time-steps, we introduce a novel type of neuron called Group Neurons (GNs). A GN is composed of multiple Integrate-and-Fire (IF) neurons as members, and its neural dynamics are meticulously designed. Based on GNs, we optimize the traditional ANN-SNN conversion framework: specifically, we replace the IF neurons in the SNNs obtained by the traditional conversion framework with GNs. The resulting SNNs achieve accuracy comparable to the source ANNs even within extremely few inference time-steps. Experiments on the CIFAR10, CIFAR100, and ImageNet datasets demonstrate the superiority of the proposed method in terms of both inference accuracy and latency. Code is available at https://github.com/Lyu6PosHao/ANN2SNN_GN.
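As background for the member neurons that make up a GN, the sketch below simulates a single Integrate-and-Fire (IF) neuron with reset-by-subtraction (soft reset), the variant commonly used in ANN-SNN conversion; over many time-steps its firing rate approximates the clipped analog activation of the source ANN. This is a minimal illustration of standard IF dynamics only — the paper's specific group dynamics for GNs are not reproduced here, and the function name and constant input are hypothetical.

```python
def if_neuron_run(inputs, v_th=1.0):
    """Simulate one IF neuron with soft reset (reset-by-subtraction),
    as is standard in ANN-SNN conversion. Returns the binary spike train."""
    v = 0.0
    spikes = []
    for x in inputs:
        v += x                    # integrate the input current
        if v >= v_th:             # fire once the threshold is reached
            spikes.append(1)
            v -= v_th             # soft reset keeps the residual potential
        else:
            spikes.append(0)
    return spikes

# Hypothetical example: a constant input current of 0.5 over 10 time-steps.
spikes = if_neuron_run([0.5] * 10)
rate = sum(spikes) / len(spikes)  # firing rate 0.5 matches the analog input
```

Because the soft reset carries the residual potential forward instead of discarding it, no input charge is lost between time-steps, which is why the firing rate converges to the analog activation as the number of time-steps grows.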