
Implementing Binarized Neural Networks with Magnetoresistive RAM without Error Correction (1908.04085v1)

Published 12 Aug 2019 in cs.ET and cs.NE

Abstract: One of the most exciting applications of Spin Torque Magnetoresistive Random Access Memory (ST-MRAM) is the in-memory implementation of deep neural networks, which could improve the energy efficiency of Artificial Intelligence by orders of magnitude compared to implementations on computers and graphics cards. In particular, ST-MRAM could be ideal for implementing Binarized Neural Networks (BNNs), a type of deep neural network introduced in 2016 that can achieve state-of-the-art performance with a greatly reduced memory footprint compared to conventional artificial intelligence approaches. The challenge of ST-MRAM, however, is that it is prone to write errors and usually requires error correction. In this work, we show that BNNs can tolerate these bit errors to an outstanding degree, based on examples of image recognition tasks (MNIST, CIFAR-10 and ImageNet): ST-MRAM bit error rates of up to 0.1% have little impact on recognition accuracy. The requirements for ST-MRAM are therefore considerably relaxed for BNNs compared to traditional applications. Consequently, we show that for BNNs, ST-MRAM can be programmed with weak (low-energy) programming conditions, without error-correcting codes. This result allows the use of low-energy and low-area ST-MRAM cells, and the energy savings at the system level can reach a factor of two.
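The paper's central experiment, measuring BNN accuracy under ST-MRAM write errors, can be emulated in simulation by flipping stored binary weights at a given bit error rate. Below is a minimal NumPy sketch of that error-injection step; it is not the authors' code, and the layer shape, seed, and function names (`binarize`, `inject_write_errors`) are illustrative assumptions. In a full experiment, the corrupted weights would be loaded into a trained BNN and recognition accuracy re-measured on MNIST, CIFAR-10, or ImageNet.

```python
import numpy as np

def binarize(weights):
    """Binarize real-valued weights to {-1, +1}, as in BNNs."""
    return np.where(weights >= 0, 1, -1).astype(np.int8)

def inject_write_errors(w_bin, ber, rng):
    """Flip each stored binary weight independently with probability `ber`,
    emulating ST-MRAM write errors with no error-correcting code."""
    flips = rng.random(w_bin.shape) < ber
    return np.where(flips, -w_bin, w_bin).astype(np.int8)

rng = np.random.default_rng(seed=0)
w = rng.standard_normal((256, 256))   # hypothetical layer weight matrix
w_bin = binarize(w)
# 0.1% bit error rate: the level the paper reports as tolerable
w_stored = inject_write_errors(w_bin, ber=1e-3, rng=rng)
print("fraction of flipped weights:", np.mean(w_bin != w_stored))
```

Because each weight carries only its sign, a 0.1% bit error rate perturbs roughly one weight in a thousand, which the paper reports has little impact on recognition accuracy.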

Authors (8)
  1. Tifenn Hirtzlin (21 papers)
  2. Bogdan Penkovsky (12 papers)
  3. Jacques-Olivier Klein (15 papers)
  4. Nicolas Locatelli (17 papers)
  5. Adrien F. Vincent (5 papers)
  6. Marc Bocquet (46 papers)
  7. Jean-Michel Portal (15 papers)
  8. Damien Querlioz (62 papers)
Citations (16)
