Neuromorphic Programming: Emerging Directions for Brain-Inspired Hardware (2410.22352v1)

Published 15 Oct 2024 in cs.NE, cs.AI, cs.DC, cs.ET, and cs.PL

Abstract: The value of brain-inspired neuromorphic computers critically depends on our ability to program them for relevant tasks. Currently, neuromorphic hardware often relies on machine learning methods adapted from deep learning. However, neuromorphic computers have potential far beyond deep learning, if only we can harness their energy efficiency and full computational power. Neuromorphic programming will necessarily be different from conventional programming, requiring a paradigm shift in how we think about programming. This paper presents a conceptual analysis of programming within the context of neuromorphic computing, challenging conventional paradigms and proposing a framework that aligns more closely with the physical intricacies of these systems. Our analysis revolves around five characteristics that are fundamental to neuromorphic programming and provides a basis for comparison to contemporary programming methods and languages. By studying past approaches, we contribute a framework that advocates for underutilized techniques and calls for richer abstractions to effectively instrument the new hardware class.

