A Quantum Leaky Integrate-and-Fire Spiking Neuron and Network (arXiv:2407.16398v1)

Published 23 Jul 2024 in quant-ph, cs.ET, and cs.NE

Abstract: Quantum machine learning is in a period of rapid development and discovery; however, it still lacks the resources and diversity of computational models of its classical counterpart. With classical models increasingly demanding extreme hardware and power budgets, and quantum models constrained by noisy intermediate-scale quantum (NISQ) hardware, there is an emerging opportunity to address both problems together. Here we introduce a new software model for quantum neuromorphic computing: a quantum leaky integrate-and-fire (QLIF) neuron, implemented as a compact, high-fidelity quantum circuit requiring only two rotation gates and no CNOT gates. We use these neurons as building blocks to construct a quantum spiking neural network (QSNN) and a quantum spiking convolutional neural network (QSCNN), the first of their kind. We apply these models to the MNIST, Fashion-MNIST, and KMNIST datasets for a full comparison with other classical and quantum models. We find that the proposed models perform competitively, with comparable accuracy, efficient scaling, and fast computation in classical simulation as well as on quantum devices.
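To make the abstract's central claim concrete, below is a minimal sketch of a single-qubit leaky integrate-and-fire neuron in PennyLane, the framework the paper's experiments use: two RY rotation gates, no CNOTs, with P(|1>) playing the role of the membrane potential. The angle-update rule, the leak factor `beta`, the input `gain`, and the firing threshold are illustrative assumptions for this sketch, not the paper's exact construction; the membrane state is also carried classically between circuit executions here.

```python
# Hypothetical QLIF sketch: a single qubit, two rotation gates, no CNOTs.
# All parameter names and the update rule are assumptions, not the paper's.
import numpy as np
import pennylane as qml

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def qlif_circuit(membrane_angle, input_angle):
    # Gate 1: re-prepare the decayed membrane state carried over (classically,
    # in this sketch) from the previous timestep.
    qml.RY(membrane_angle, wires=0)
    # Gate 2: integrate the current weighted input. No entangling gates needed.
    qml.RY(input_angle, wires=0)
    # P(|1>) serves as the membrane potential.
    return qml.probs(wires=0)

def qlif_neuron(spike_train, beta=0.7, gain=np.pi / 4, threshold=0.5):
    """Drive the neuron with a binary spike train; beta is the leak factor."""
    angle, out = 0.0, []
    for s in spike_train:
        p1 = qlif_circuit(beta * angle, gain * s)[1]  # membrane potential
        angle = beta * angle + gain * s               # classical carry-over
        if p1 >= threshold:  # fire and reset, as in a classical LIF neuron
            out.append(1)
            angle = 0.0
        else:
            out.append(0)
    return out

# Three consecutive input spikes accumulate enough rotation to fire:
print(qlif_neuron([1, 1, 1, 0, 1, 1, 1, 0]))  # -> [0, 0, 1, 0, 0, 0, 1, 0]
```

Because both rotations share the RY axis, they compose into a single effective rotation, which is what keeps the circuit shallow and CNOT-free; the leak-then-integrate dynamics live entirely in how the angles are scheduled across timesteps.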

