
Power efficient ReLU design for neuromorphic computing using spin Hall effect (2303.06463v1)

Published 11 Mar 2023 in cond-mat.mes-hall

Abstract: We demonstrate a magnetic tunnel junction injected with spin Hall current to exhibit linear rotation of the free-ferromagnet's magnetization using only the spin current. Using the linear resistance change of the MTJ, we devise a circuit for the rectified linear activation (ReLU) function of the artificial neuron. We explore the role of different spin Hall effect (SHE) heavy metal layers on the power consumption of the ReLU circuit. We benchmark the power consumption of the ReLU circuit with different SHE layers by defining a new parameter called the spin Hall power factor. It combines the spin Hall angle, resistivity, and thickness of the heavy metal layer, which translates to the power consumption of the different SHE layers during spin-orbit switching/rotation of the free FM. We employ a hybrid spintronics-CMOS simulation framework that couples the Keldysh non-equilibrium Green's function formalism with the Landau-Lifshitz-Gilbert-Slonczewski equation and the HSPICE circuit simulator to account for the diverse physics of spin transport and the CMOS elements in our proposed ReLU design. We also demonstrate the robustness of the proposed ReLU circuit against thermal noise and a non-trivial power-error trade-off that enables the use of an unstable free-ferromagnet for energy-efficient design. Using the proposed circuit, we evaluate the performance of a convolutional neural network on the MNIST dataset and demonstrate classification accuracies comparable to the ideal ReLU with an energy consumption of 75 pJ per sample.
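The hybrid framework described in the abstract solves the Landau-Lifshitz-Gilbert-Slonczewski (LLGS) dynamics of the free ferromagnet under a spin Hall torque. As a rough illustration only (not the paper's actual NEGF/HSPICE-coupled solver, and with hypothetical parameter values), a macrospin forward-Euler sketch of these dynamics might look like:

```python
import numpy as np

# Macrospin LLGS sketch (hypothetical parameters, for illustration only).
GAMMA = 1.76e11   # gyromagnetic ratio (rad s^-1 T^-1)
ALPHA = 0.01      # Gilbert damping constant (assumed)

def llgs_rhs(m, h_eff, tau_sl, sigma):
    """Explicit-form LLGS right-hand side dm/dt.

    m      : unit magnetization vector of the free layer
    h_eff  : effective field (T)
    tau_sl : Slonczewski (spin Hall) torque prefactor (rad/s), assumed value
    sigma  : spin-polarization unit vector of the SHE-injected current
    """
    pre = -GAMMA / (1.0 + ALPHA**2)
    mxh = np.cross(m, h_eff)
    damping = ALPHA * np.cross(m, mxh)          # Gilbert damping term
    stt = tau_sl * np.cross(m, np.cross(m, sigma))  # Slonczewski torque
    return pre * (mxh + damping) + stt

def integrate(m0, h_eff, tau_sl, sigma, dt=1e-13, steps=10000):
    """Forward-Euler integration, renormalizing |m| = 1 at each step."""
    m = np.asarray(m0, dtype=float)
    for _ in range(steps):
        m = m + dt * llgs_rhs(m, h_eff, tau_sl, sigma)
        m /= np.linalg.norm(m)
    return m

# Example: free layer initially along +z with a small field along +z;
# the spin current polarized along +y rotates the magnetization.
m_final = integrate([0.0, 0.0, 1.0],
                    h_eff=np.array([0.0, 0.0, 0.05]),
                    tau_sl=5e9,
                    sigma=np.array([0.0, 1.0, 0.0]))
```

In the paper's scheme, the resulting rotation angle maps to a linear MTJ resistance change, which the surrounding CMOS circuitry converts into the ReLU response; the sketch above covers only the magnetization dynamics.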

Citations (6)
