
Probabilistic Deep Spiking Neural Systems Enabled by Magnetic Tunnel Junction (1605.04494v1)

Published 15 May 2016 in cs.ET

Abstract: Deep Spiking Neural Networks are becoming increasingly powerful tools for cognitive computing platforms. However, most of the existing literature on such computing models is developed with limited insight into the underlying hardware implementation, resulting in area- and power-expensive designs. Although several neuromimetic devices emulating neural operations have been proposed recently, their functionality has been limited to very simple neural models that may prove inefficient at complex recognition tasks. In this work, we venture into the relatively unexplored area of utilizing the inherent device stochasticity of such neuromimetic devices to model complex neural functionalities in a probabilistic framework in the time domain. We consider the implementation of a Deep Spiking Neural Network capable of performing high-accuracy and low-latency classification tasks, where the neural computing unit is enabled by the stochastic switching behavior of a Magnetic Tunnel Junction. Simulation studies indicate an energy improvement of $20\times$ over a baseline CMOS design in $45nm$ technology.
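The core idea in the abstract, a neuron whose firing is governed by the stochastic switching probability of an MTJ, can be sketched in software as a probabilistic spiking unit whose switching probability follows a sigmoid of its weighted input. This is a minimal illustrative model only; the sigmoid form, the scaling constant `beta`, and the function names are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def mtj_spike(inputs, weights, beta=1.0, rng=None):
    """Probabilistic spiking neuron (illustrative sketch).

    The weighted input drive sets a switching probability via a
    sigmoid, standing in for the write-current-dependent stochastic
    switching of a magnetic tunnel junction. `beta` is a hypothetical
    scaling constant, not a parameter from the paper.
    Returns (spike, p_switch): a 0/1 spike sample and its probability.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    drive = float(np.dot(weights, inputs))
    p_switch = 1.0 / (1.0 + np.exp(-beta * drive))  # switching probability
    spike = int(rng.random() < p_switch)            # Bernoulli sample
    return spike, p_switch
```

Averaging spikes over a time window then recovers the underlying analog activation, which is how a time-domain probabilistic framework can emulate a rate-coded deep network.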

Authors (4)
  1. Abhronil Sengupta (50 papers)
  2. Maryam Parsa (25 papers)
  3. Bing Han (74 papers)
  4. Kaushik Roy (265 papers)
Citations (90)
