Exploring Liquid Neural Networks on Loihi-2 (2407.20590v1)

Published 30 Jul 2024 in cs.ET and cs.AR

Abstract: This study investigates liquid neural networks (LNNs) and their deployment on neuromorphic hardware platforms. It provides an in-depth analysis of liquid state machines (LSMs) and explores the adaptation of LNN architectures to neuromorphic systems, highlighting both theoretical foundations and practical applications. We introduce a pioneering approach to image classification on the CIFAR-10 dataset by implementing LNNs on state-of-the-art neuromorphic hardware. Our Loihi-2 ASIC-based architecture demonstrates exceptional performance, achieving 91.3% accuracy while consuming only 213 microjoules per frame. These results underscore the substantial potential of LNNs for advancing neuromorphic computing and establish a new benchmark for the field in terms of both efficiency and accuracy.
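For readers unfamiliar with this model family: liquid time-constant (LTC) networks, the continuous-time variant underlying much LNN work, evolve each neuron's state by the ODE dx/dt = -[1/τ + f(x, I; θ)]·x + f(x, I; θ)·A (Hasani et al., 2021), so the effective time constant is modulated by the input. The sketch below is a minimal NumPy illustration of one LTC layer integrated with explicit Euler; the layer sizes, random weights, sigmoid nonlinearity, and solver are illustrative assumptions and do not reflect the paper's spiking Loihi-2 implementation or its training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and parameters (assumptions, not from the paper).
n_in, n_hidden = 8, 16
W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))    # input weights
W_rec = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # recurrent weights
b = np.zeros(n_hidden)                                 # biases
tau = np.full(n_hidden, 1.0)                           # base time constants
A = np.ones(n_hidden)                                  # target-state vector A

def f(x, I):
    # Sigmoid keeps the gate in (0, 1), so the effective time constant
    # 1 / (1/tau + f) stays positive and bounded.
    z = W_rec @ x + W_in @ I + b
    return 1.0 / (1.0 + np.exp(-z))

def ltc_step(x, I, dt=0.05):
    # One explicit-Euler step of dx/dt = -(1/tau + f)*x + f*A.
    gate = f(x, I)
    dx = -(1.0 / tau + gate) * x + gate * A
    return x + dt * dx

# Drive the liquid with a random input sequence and inspect the state,
# which a downstream readout layer would classify.
x = np.zeros(n_hidden)
for _ in range(100):
    x = ltc_step(x, rng.normal(size=n_in))
print(np.round(x[:4], 3))
```

For scale, the reported energy budget of 213 µJ per frame works out to roughly 4,700 classified frames per joule (1 / 213×10⁻⁶ ≈ 4695), which is the kind of efficiency that motivates moving LNNs onto neuromorphic ASICs like Loihi-2.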
