
Variation-Resilient FeFET-Based In-Memory Computing Leveraging Probabilistic Deep Learning (2312.15444v2)

Published 24 Dec 2023 in cs.ET

Abstract: Reliability issues stemming from device-level non-idealities of non-volatile emerging technologies like ferroelectric field-effect transistors (FeFET), especially at scaled dimensions, cause substantial degradation in the accuracy of in-memory crossbar-based AI systems. In this work, we present a variation-aware design technique that characterizes device-level variations and mitigates their impact on hardware accuracy using a Bayesian Neural Network (BNN) approach. An effective conductance variation model is derived from experimental measurements of cycle-to-cycle (C2C) and device-to-device (D2D) variations performed on FeFET devices fabricated in 28 nm high-$k$ metal gate technology. The variations were found to be a function of the conductance state within the given programming range, in sharp contrast to earlier efforts that assumed a fixed variation dispersion for all conductance values. These variation characteristics, formulated for three device sizes at different read voltages, were provided as prior variation information to the BNN to yield more accurate and reliable inference. Near-ideal accuracy for shallow networks (MLP5 and LeNet models) on the MNIST dataset, and a limited accuracy decline of $\sim$3.8-16.1% for deeper AlexNet models on the CIFAR10 dataset under a wide range of variations corresponding to different device sizes and read voltages, demonstrate the efficacy of our proposed device-algorithm co-design technique.
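The core idea — injecting a conductance-state-dependent variation model into inference, rather than a fixed dispersion — can be illustrated with a minimal Monte-Carlo sketch. Everything here is an assumption for illustration: the linear `state_dependent_sigma` model stands in for the paper's measured C2C/D2D characteristics, and the sampled matrix-vector product stands in for a BNN forward pass through a FeFET crossbar.

```python
import numpy as np

rng = np.random.default_rng(0)

def state_dependent_sigma(g, g_min=1e-6, g_max=1e-4):
    # Hypothetical variation model: the dispersion grows with the
    # programmed conductance state. The paper fits this relation from
    # C2C/D2D measurements; the linear form here is an illustrative
    # assumption, not the measured characteristic.
    norm = (g - g_min) / (g_max - g_min)
    return 0.05 * g * (1.0 + norm)  # larger spread at higher states

def noisy_crossbar_matmul(x, G, n_samples=100):
    # Monte-Carlo inference: sample conductance instances from the
    # state-dependent distribution and average the outputs, mimicking
    # a BNN forward pass with variation-aware priors.
    outs = []
    for _ in range(n_samples):
        G_sample = G + rng.normal(0.0, state_dependent_sigma(G))
        outs.append(x @ G_sample)
    return np.mean(outs, axis=0)

# Toy 4x3 crossbar with conductances in the programming range.
G = rng.uniform(1e-6, 1e-4, size=(4, 3))
x = np.ones(4)
y = noisy_crossbar_matmul(x, G)
```

Because the sampled noise is zero-mean, the averaged output converges to the ideal product `x @ G`; the point of the BNN formulation in the paper is that training against the *correct*, state-dependent dispersion (rather than a fixed one) keeps single-shot inference accurate as well.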

