Ultra-low-power Image Classification on Neuromorphic Hardware (2309.16795v2)

Published 28 Sep 2023 in cs.CV

Abstract: Spiking neural networks (SNNs) promise ultra-low-power applications by exploiting temporal and spatial sparsity. The number of binary activations, called spikes, is proportional to the power consumed when executed on neuromorphic hardware. Training such SNNs using backpropagation through time for vision tasks that rely mainly on spatial features is computationally costly. For image recognition datasets, a straightforward alternative is to train a stateless artificial neural network (ANN) and then convert its weights to an SNN. Most conversion methods rely on rate coding in the SNN to represent ANN activations, which uses enormous numbers of spikes, and therefore energy, to encode information. Recently, temporal conversion methods have shown promising results that require significantly fewer spikes per neuron, though sometimes at the cost of complex neuron models. We propose a temporal ANN-to-SNN conversion method, which we call Quartz, that is based on the time to first spike (TTFS). Quartz achieves high classification accuracy and can be easily implemented on neuromorphic hardware while using the fewest synaptic operations and memory accesses. Compared to previous temporal conversion methods, it incurs a cost of two additional synapses per neuron, which are readily available on neuromorphic hardware. We benchmark Quartz on MNIST, CIFAR10, and ImageNet in simulation to show the benefits of our method, and follow up with an implementation on Loihi, a neuromorphic chip by Intel. We provide evidence that temporal coding offers advantages in terms of power consumption, throughput, and latency at similar classification accuracy. Our code and models are publicly available.
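To make the coding contrast concrete, below is a minimal sketch of time-to-first-spike encoding. This is illustrative only: the function names, the linear latency mapping, and the t_max parameter are assumptions for exposition, not the paper's Quartz algorithm. The key property is that larger activations spike earlier and each neuron fires at most once, so spike count (and hence energy) scales with the number of neurons rather than with firing rates, as it does under rate coding.

```python
import numpy as np

def ttfs_encode(activations, t_max=100.0):
    """Encode normalized ANN activations as time-to-first-spike latencies.

    Larger activations spike earlier; a zero activation never spikes.
    Illustrative sketch only, not the paper's Quartz implementation.
    """
    a = np.clip(np.asarray(activations, dtype=float), 0.0, 1.0)
    # Linear latency mapping (an assumption): activation 1.0 spikes at
    # t=0, activations near 0 spike near t_max, and exactly 0 never spikes.
    return np.where(a > 0, (1.0 - a) * t_max, np.inf)  # inf = no spike

def ttfs_decode(spike_times, t_max=100.0):
    """Recover the encoded activation from a spike latency."""
    t = np.asarray(spike_times, dtype=float)
    return np.where(np.isfinite(t), 1.0 - t / t_max, 0.0)

# Each neuron emits at most one spike, so synaptic operations scale with
# neuron count, not with firing rate as in rate coding.
acts = [0.9, 0.5, 0.0]
times = ttfs_encode(acts)          # -> [10., 50., inf]
print(times, ttfs_decode(times))   # round-trips the activations
```

Under this scheme a rate-coded SNN would need on the order of tens to hundreds of spikes per neuron to represent the same activation precision, which is the gap the TTFS approach exploits.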
