Curriculum Design Helps Spiking Neural Networks to Classify Time Series (2401.10257v1)
Abstract: Spiking Neural Networks (SNNs) have greater potential for modeling time series data than Artificial Neural Networks (ANNs), owing to their inherent neuron dynamics and low energy consumption. However, their superiority in classification accuracy has been difficult to demonstrate, because current efforts focus mainly on designing better network structures. In this work, inspired by brain science, we find that not only the structure but also the learning process should be human-like. To this end, we investigate the power of Curriculum Learning (CL) on SNNs by designing a novel method named CSNN with two theoretically guaranteed mechanisms: the active-to-dormant training order makes the curriculum similar to that of human learning and suitable for spiking neurons, and the value-based regional encoding makes neuron activity mimic brain memory when learning sequential data. Experiments on multiple time series sources, including simulated, sensor, motion, and healthcare data, demonstrate that CL has a more positive effect on SNNs than on ANNs, with about twice the accuracy change, and that CSNN can increase SNN accuracy by about 3% through improved network sparsity, neuron firing status, anti-noise ability, and convergence speed.
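The curriculum idea described above can be illustrated with a generic easy-to-hard training schedule. This is a minimal sketch, not the paper's CSNN: the function names (`curriculum_order`, `curriculum_batches`) and the scalar "difficulty" scores are illustrative assumptions standing in for the paper's active-to-dormant ordering of spiking inputs.

```python
# Generic curriculum-learning schedule sketch (illustrative assumption,
# NOT the paper's active-to-dormant criterion): samples are ranked by a
# scalar difficulty score and presented easiest-first in growing stages.

def curriculum_order(samples, difficulty):
    """Return sample indices sorted easiest-first (ascending difficulty)."""
    return sorted(range(len(samples)), key=lambda i: difficulty[i])

def curriculum_batches(samples, difficulty, n_stages=3):
    """Split samples into stages of increasing difficulty.

    Each stage cumulatively includes all easier samples, a common CL
    schedule: the model first trains on easy data, then progressively
    sees harder examples until the full set is used.
    """
    order = curriculum_order(samples, difficulty)
    stage_size = max(1, len(order) // n_stages)
    stages = []
    for s in range(1, n_stages + 1):
        # Final stage always covers the whole dataset.
        cutoff = len(order) if s == n_stages else min(len(order), s * stage_size)
        stages.append([samples[i] for i in order[:cutoff]])
    return stages
```

In a CL-for-SNNs setting, the difficulty score would come from a property of the spiking input (e.g. how strongly it drives neuron activity), and each stage would be one phase of training.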