A Methodology to Study the Impact of Spiking Neural Network Parameters considering Event-Based Automotive Data (2404.03493v3)
Abstract: Autonomous Driving (AD) systems are considered the future of human mobility and transportation. Solving computer vision tasks such as image classification and object detection/segmentation with high accuracy and low power/energy consumption is essential to realizing AD systems in real life. These requirements can potentially be satisfied by Spiking Neural Networks (SNNs). However, state-of-the-art works on SNN-based AD systems still focus on proposing network models that achieve high accuracy, and they have not systematically studied the roles of SNN parameters when learning event-based automotive data. Therefore, we still lack an understanding of how to effectively develop SNN models for AD systems. Toward this, we propose a novel methodology to systematically study and analyze the impact of SNN parameters considering event-based automotive data, then leverage this analysis to enhance SNN developments. To do this, we first explore different settings of the SNN parameters that directly affect the learning mechanism (i.e., batch size, learning rate, neuron threshold potential, and weight decay), then analyze the accuracy results. Afterward, we propose techniques that jointly improve SNN accuracy and reduce training time. Experimental results show that our methodology improves SNN models for AD systems over the state-of-the-art, as it achieves higher accuracy (i.e., 86%) on the NCARS dataset, and it can also achieve iso-accuracy (i.e., ~85% with standard deviation less than 0.5%) while speeding up training by 1.9x. In this manner, our research work provides a set of guidelines for SNN parameter enhancements, thereby enabling the practical development of SNN-based AD systems.
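To make the parameter-exploration step concrete, the sketch below (an illustration, not the paper's actual code) builds a hypothetical search grid over the four parameters the abstract names (batch size, learning rate, neuron threshold potential, and weight decay) and probes how the threshold setting alone changes the spiking activity of a toy leaky integrate-and-fire (LIF) neuron. The grid values, input trace, and the `lif_spike_count` helper are all assumptions for illustration.

```python
# Minimal sketch (assumption: not the paper's code) of exploring SNN
# parameters -- batch size, learning rate, neuron threshold potential,
# and weight decay -- using a toy leaky integrate-and-fire (LIF) neuron.
from itertools import product

def lif_spike_count(inputs, threshold, leak=0.9):
    """Count output spikes of a single LIF neuron over an input current trace."""
    v, spikes = 0.0, 0
    for i in inputs:
        v = leak * v + i          # leaky membrane integration
        if v >= threshold:        # fire when the potential crosses the threshold
            spikes += 1
            v = 0.0               # hard reset after a spike
    return spikes

# Hypothetical search grid over the four parameters studied in the paper.
grid = {
    "batch_size": [32, 64],
    "learning_rate": [1e-3, 5e-3],
    "threshold": [0.5, 1.0, 2.0],
    "weight_decay": [0.0, 1e-4],
}

inputs = [0.4, 0.6, 0.2, 0.8, 0.5, 0.7]  # toy input current trace
for bs, lr, th, wd in product(*grid.values()):
    # In the actual study, each configuration would be trained on
    # event-based data (e.g., NCARS) and its accuracy logged; here we
    # only show the threshold's direct effect on spiking activity.
    print(f"bs={bs} lr={lr} thr={th} wd={wd} -> spikes={lif_spike_count(inputs, th)}")
```

In a full experiment, each grid point would drive a training run whose accuracy and training time are recorded, which is the analysis the methodology then uses to derive parameter guidelines.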