Electrical Tunable Spintronic Neuron with Trainable Activation Function (2211.13391v1)
Abstract: Spintronic devices have been widely studied for the hardware realization of artificial neurons. The stochastic switching of a magnetic tunnel junction driven by spin torque is commonly used to produce a sigmoid activation function. However, in previous studies the shape of the activation function is fixed during the training of the neural network, which restricts the updating of weights and limits performance. In this work, we exploit the physics behind spin-torque-induced magnetization switching to enable dynamic change of the activation function during training. Specifically, the pulse width and magnetic anisotropy can be electrically controlled to change the slope of the activation function, enabling a faster or slower change of the output as required by the backpropagation algorithm. This is also similar to the idea of batch normalization, which is widely used in machine learning. Thus, this work demonstrates that such algorithms are no longer limited to software implementation; they can in fact be realized in spintronic hardware using a single device. Finally, we show that the accuracy of handwritten-digit recognition can be improved from 88% to 91.3% by using these trainable spintronic neurons without introducing additional energy consumption. Our proposal can stimulate the hardware realization of spintronic neural networks.
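The trainable activation described above can be pictured as a sigmoid whose slope is itself a learnable parameter. The following is a minimal sketch, not the paper's device model: the slope parameter `beta` stands in for the physical knobs (pulse width and electrically tuned magnetic anisotropy) that set the steepness of the MTJ switching probability, and the gradient with respect to `beta` is what backpropagation would use to train the activation shape alongside the weights.

```python
import numpy as np

def sigmoid(x, beta):
    """Sigmoid activation with an adjustable slope.

    `beta` is a hypothetical stand-in for the physical slope control
    (pulse width / magnetic anisotropy) described in the paper.
    """
    return 1.0 / (1.0 + np.exp(-beta * x))

def activation_grads(x, beta, upstream=1.0):
    """Gradients needed by backpropagation to train the slope.

    Returns (dL/dx, dL/dbeta) given the upstream gradient dL/dy,
    using dy/dx = beta * s * (1 - s) and dy/dbeta = x * s * (1 - s).
    """
    s = sigmoid(x, beta)
    ds = s * (1.0 - s)  # derivative of sigmoid w.r.t. its argument
    return upstream * ds * beta, upstream * ds * x

# A larger beta makes the neuron switch more sharply around x = 0,
# i.e. a faster change of output; a smaller beta makes it slower.
y_shallow = sigmoid(1.0, beta=0.5)
y_steep = sigmoid(1.0, beta=4.0)
```

Treating `beta` as a per-neuron trainable scalar is loosely analogous to the learned scale in batch normalization, which matches the analogy drawn in the abstract.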