Exploring Liquid Neural Networks on Loihi-2 (2407.20590v1)
Abstract: This study investigates liquid neural networks (LNNs) and their deployment on neuromorphic hardware platforms. It provides an in-depth analysis of liquid state machines (LSMs) and explores the adaptation of LNN architectures to neuromorphic systems, covering both their theoretical foundations and practical applications. We introduce an approach to image classification on the CIFAR-10 dataset by implementing LNNs on state-of-the-art neuromorphic hardware. Our Loihi-2 ASIC-based architecture achieves 91.3% accuracy while consuming only 213 microjoules per frame. These results underscore the potential of LNNs for advancing neuromorphic computing and set a new benchmark for the field in both efficiency and accuracy.
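To make the "liquid" in liquid neural networks concrete: the core building block is the liquid time-constant (LTC) neuron introduced by Hasani et al. (AAAI 2021), whose effective time constant varies with the input rather than being fixed. Below is a minimal, illustrative sketch of one forward-Euler integration step of the LTC dynamics dx/dt = -[1/τ + f(x, I)]·x + f(x, I)·A. The parameter names (`W`, `b`, `A`) and the sigmoid choice for f are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def ltc_step(x, I, dt, tau, W, b, A):
    """One forward-Euler step of a liquid time-constant (LTC) neuron layer.

    Sketch of the LTC dynamics dx/dt = -[1/tau + f(x, I)] * x + f(x, I) * A,
    where f is a bounded nonlinearity of the hidden state and the input.
    Parameter shapes and names here are illustrative assumptions.
    """
    # f gates the state x and external input I through a sigmoid nonlinearity,
    # so the effective decay rate (1/tau + f) depends on the input: the
    # time constant is "liquid" rather than fixed.
    f = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, I]) + b)))
    dxdt = -(1.0 / tau + f) * x + f * A
    return x + dt * dxdt

# Tiny usage example: 4 neurons driven by 3 inputs for one Euler step.
rng = np.random.default_rng(0)
n, m = 4, 3
x = np.zeros(n)                       # initial hidden state
W = rng.normal(size=(n, n + m))       # joint state/input weights
x_next = ltc_step(x, rng.normal(size=m), dt=0.1, tau=1.0,
                  W=W, b=np.zeros(n), A=np.ones(n))
```

Because the sigmoid gate is strictly between 0 and 1, a step from a zero state moves each neuron a bounded distance toward its target `A`, which is what keeps LTC trajectories stable under varying inputs.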