
SparseTrain: Exploiting Dataflow Sparsity for Efficient Convolutional Neural Networks Training (2007.13595v1)

Published 21 Jul 2020 in cs.CV and cs.AR

Abstract: Training Convolutional Neural Networks (CNNs) usually requires a large amount of computational resources. In this paper, \textit{SparseTrain} is proposed to accelerate CNN training by fully exploiting sparsity. It involves three levels of innovation: an activation gradients pruning algorithm, a sparse training dataflow, and an accelerator architecture. By applying a stochastic pruning algorithm to each layer, the sparsity of back-propagation gradients can be increased dramatically without degrading training accuracy or convergence rate. Moreover, to utilize both \textit{natural sparsity} (resulting from ReLU or Pooling layers) and \textit{artificial sparsity} (introduced by the pruning algorithm), a sparsity-aware architecture is proposed for training acceleration. This architecture supports forward and back-propagation of CNNs by adopting a 1-Dimensional convolution dataflow. We have built a simple compiler to map CNN topologies onto \textit{SparseTrain}, and a cycle-accurate architecture simulator to evaluate performance and efficiency based on the synthesized design with $14nm$ FinFET technologies. Evaluation results on AlexNet/ResNet show that \textit{SparseTrain} achieves about $2.7 \times$ speedup and $2.2 \times$ energy efficiency improvement on average compared with the original training process.
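The stochastic pruning idea described in the abstract can be sketched as follows. This is a hedged illustration, not the paper's exact algorithm: it assumes the common unbiased formulation in which a gradient entry below a threshold `tau` is either zeroed or snapped to `sign(g) * tau`, with probabilities chosen so the pruned tensor equals the original in expectation. The function name and the choice of `tau` are hypothetical.

```python
import numpy as np

def stochastic_prune(grad, tau):
    """Stochastically prune small activation gradients (illustrative sketch).

    Entries with |g| >= tau are kept unchanged. Entries with |g| < tau are
    set to sign(g) * tau with probability |g| / tau, and to 0 otherwise,
    so E[pruned] == grad (unbiased) while most small entries become zero.
    """
    g = np.asarray(grad, dtype=np.float64)
    small = np.abs(g) < tau
    keep_prob = np.abs(g) / tau          # probability of snapping up to tau
    r = np.random.random(g.shape)
    snapped = np.where(r < keep_prob, np.sign(g) * tau, 0.0)
    return np.where(small, snapped, g)
```

Because the estimator is unbiased, gradient descent still converges in expectation, while the induced zeros (the "artificial sparsity") can be skipped by a sparsity-aware dataflow.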

Authors (8)
  1. Pengcheng Dai
  2. Jianlei Yang
  3. Xucheng Ye
  4. Xingzhou Cheng
  5. Junyu Luo
  6. Linghao Song
  7. Yiran Chen
  8. Weisheng Zhao
Citations (20)