
A Survey of Neuromorphic Computing and Neural Networks in Hardware (1705.06963v1)

Published 19 May 2017 in cs.NE

Abstract: Neuromorphic computing has come to refer to a variety of brain-inspired computers, devices, and models that contrast the pervasive von Neumann computer architecture. This biologically inspired approach has created highly connected synthetic neurons and synapses that can be used to model neuroscience theories as well as solve challenging machine learning problems. The promise of the technology is to create a brain-like ability to learn and adapt, but the technical challenges are significant, starting with an accurate neuroscience model of how the brain works, to finding materials and engineering breakthroughs to build devices to support these models, to creating a programming framework so the systems can learn, to creating applications with brain-like capabilities. In this work, we provide a comprehensive survey of the research and motivations for neuromorphic computing over its history. We begin with a 35-year review of the motivations and drivers of neuromorphic computing, then look at the major research areas of the field, which we define as neuro-inspired models, algorithms and learning approaches, hardware and devices, supporting systems, and finally applications. We conclude with a broad discussion on the major research topics that need to be addressed in the coming years to see the promise of neuromorphic computing fulfilled. The goals of this work are to provide an exhaustive review of the research conducted in neuromorphic computing since the inception of the term, and to motivate further work by illuminating gaps in the field where new research is needed.

Citations (660)

Summary

  • The paper provides a comprehensive review of 35 years of neuromorphic computing evolution, focusing on hardware implementations and neuron models.
  • It categorizes diverse neuromorphic architectures and learning algorithms, outlining digital, analog, and mixed systems with spiking dynamics.
  • The research highlights emerging device-level components like memristors and charts future directions for real-time, low-power AI applications.

A Survey of Neuromorphic Computing and Neural Networks in Hardware

The paper, "A Survey of Neuromorphic Computing and Neural Networks in Hardware," provides a comprehensive analysis of the evolution and current state of neuromorphic computing, examining its motivations, models, algorithms, hardware implementations, and applications. The research traverses a 35-year history and references over 3,000 papers, offering a valuable resource for those in the computational neuroscience and hardware-based artificial intelligence fields.

Evolution and Drivers of Neuromorphic Computing

Neuromorphic computing, inspired by the architecture and functioning of the human brain, diverges significantly from the traditional von Neumann architecture. Key drivers include overcoming inherent limitations such as the von Neumann bottleneck, increased power demands, and the end of Moore's Law. Neuromorphic systems promise advantages like low power consumption, parallel processing, and real-time learning capabilities, making them attractive for future computational needs.

Neuromorphic Models

The paper categorizes neuromorphic neuron models by biological fidelity, from biologically plausible models such as detailed Hodgkin-Huxley implementations, through biologically inspired and integrate-and-fire models, down to simple McCulloch-Pitts units and digital spiking neurons. The choice of model typically trades computational cost against biological accuracy.
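To make the integrate-and-fire family concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron in Python. The parameter values (membrane time constant, resistance, thresholds) are illustrative defaults, not figures from the survey:

```python
import numpy as np

def simulate_lif(input_current, dt=1e-4, tau_m=20e-3, v_rest=-65e-3,
                 v_reset=-65e-3, v_thresh=-50e-3, r_m=10e6):
    """Simulate a leaky integrate-and-fire neuron.

    input_current: injected current in amperes, one sample per time step.
    Returns the membrane-voltage trace and the indices of spike times.
    """
    v = v_rest
    voltages, spikes = [], []
    for t, i_inj in enumerate(input_current):
        # Membrane voltage leaks toward rest while integrating input current.
        v += (-(v - v_rest) + r_m * i_inj) * (dt / tau_m)
        if v >= v_thresh:      # threshold crossing: emit a spike...
            spikes.append(t)
            v = v_reset        # ...and hard-reset the membrane
        voltages.append(v)
    return np.array(voltages), spikes

# A constant 2 nA input drives the neuron above threshold repeatedly.
v_trace, spike_times = simulate_lif(np.full(1000, 2e-9))
```

The leak term is what distinguishes LIF from a pure integrator: without input, the voltage decays back to rest with time constant `tau_m`, so only sufficiently strong or well-timed inputs produce spikes.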

Learning Algorithms

Neuromorphic systems employ a variety of learning algorithms. Back-propagation remains the dominant method for training conventional neural networks, but it maps poorly onto real-time, on-chip learning in neuromorphic devices. Algorithms better matched to neuromorphic hardware, such as spike-timing-dependent plasticity (STDP), support online, unsupervised learning, yet require further development for broader applicability.
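The core of pair-based STDP can be sketched in a few lines: when a presynaptic spike precedes a postsynaptic spike the synapse is strengthened, and in the reverse order it is weakened, with exponentially decaying magnitude. The learning-rate and time-constant values below are illustrative, not taken from the survey:

```python
import math

def stdp_update(w, dt_spike, a_plus=0.01, a_minus=0.012,
                tau_plus=20e-3, tau_minus=20e-3, w_min=0.0, w_max=1.0):
    """Pair-based STDP weight update; dt_spike = t_post - t_pre in seconds.

    Pre-before-post (dt_spike > 0) potentiates the synapse;
    post-before-pre (dt_spike < 0) depresses it.
    """
    if dt_spike > 0:
        dw = a_plus * math.exp(-dt_spike / tau_plus)
    else:
        dw = -a_minus * math.exp(dt_spike / tau_minus)
    # Clip the weight to its allowed range.
    return min(max(w + dw, w_min), w_max)

w = 0.5
w_after_ltp = stdp_update(w, 5e-3)    # causal pairing strengthens
w_after_ltd = stdp_update(w_after_ltp, -5e-3)  # anti-causal pairing weakens
```

Because each update depends only on the relative timing of a local spike pair, the rule needs no global error signal, which is what makes it attractive for on-chip, unsupervised learning.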

Hardware Implementations

Neuromorphic hardware can be categorized into digital, analog, and mixed analog/digital systems. FPGA-based and custom ASIC designs like IBM's TrueNorth and the SpiNNaker platform exemplify digital implementations, whereas analog variants align more closely with neurobiological processes. Mixed analog/digital approaches offer a balance, combining analog efficiency with digital precision.

Advanced Device-Level Components

Emerging technologies such as memristors, phase-change memory, and spintronic devices are crucial in advancing neuromorphic computing. Memristors, in particular, are favored for their synapse-like conductance behavior and suitability for implementing STDP, although challenges such as device variability and sneak-path currents persist.
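The synapse-like behavior can be illustrated with the well-known linear ion-drift memristor model (Strukov et al.), in which the device resistance interpolates between a low and a high value as charge flows through it. The parameter values here are illustrative, not device measurements from the survey:

```python
def memristor_step(x, current, dt, r_on=100.0, r_off=16e3,
                   mobility=1e-14, depth=1e-8):
    """One Euler step of the linear ion-drift memristor model.

    x in [0, 1] is the normalized width of the doped region; resistance
    interpolates between r_on (fully doped) and r_off (undoped).
    """
    # State drifts in proportion to the current through the device.
    dx = mobility * r_on / depth**2 * current * dt
    x = min(max(x + dx, 0.0), 1.0)
    resistance = r_on * x + r_off * (1.0 - x)
    return x, resistance

# Repeated positive-current pulses progressively lower the resistance,
# analogous to potentiating a synapse.
x, resistances = 0.1, []
for _ in range(5):
    x, r = memristor_step(x, 1e-4, 1e-3)
    resistances.append(r)
```

Reversing the current polarity drives the state back and raises the resistance, giving the bidirectional, history-dependent conductance that maps naturally onto STDP-style weight updates.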

Supporting Systems and Applications

For practical deployment, neuromorphic systems require robust communication frameworks and supporting software to facilitate usability and integration. Applications span from sensory data processing in smart devices to real-time control in robotics, indicating strong potential across various domains.

Future Directions

The research identifies several key areas for further exploration:

  • Learning Algorithms: Development of novel, neuromorphic-specific training methods is critical.
  • Integration with Emerging Technologies: Collaborative efforts between neuroscientists, materials scientists, and computer engineers will be essential in leveraging cutting-edge materials for neuromorphic applications.
  • Application Expansion: Identifying and harnessing the unique strengths of neuromorphic systems in real-world applications remain pivotal.

This extensive survey highlights the profound implications neuromorphic computing holds for the future of AI, suggesting a paradigm shift from traditional computational architectures to those inspired by the very nature of biological intelligence.