A Survey on Green Deep Learning (2111.05193v2)

Published 8 Nov 2021 in cs.LG and cs.CL

Abstract: In recent years, larger and deeper models have been springing up, continuously pushing state-of-the-art (SOTA) results across fields such as NLP and computer vision (CV). However, despite promising results, the computation required by SOTA models has increased at an exponential rate. Massive computation not only carries a surprisingly large carbon footprint but also harms research inclusiveness and deployment in real-world applications. Green deep learning is an increasingly active research field that calls on researchers to pay attention to energy usage and carbon emissions during model training and inference, with the goal of yielding novel results using lightweight and efficient technologies. Many technologies can be used toward this goal, such as model compression and knowledge distillation. This paper presents a systematic review of the development of Green deep learning technologies. We classify these approaches into four categories: (1) compact networks, (2) energy-efficient training strategies, (3) energy-efficient inference approaches, and (4) efficient data usage. For each category, we discuss the progress that has been achieved and the unresolved challenges.
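
The abstract names knowledge distillation as one representative Green technique. As a rough illustration only (not the paper's own method), the sketch below shows a standard Hinton-style distillation loss in PyTorch, where a compact student matches a larger teacher's softened outputs; the temperature `T` and mixing weight `alpha` are assumed hyperparameters chosen here for illustration.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft-target KL term (scaled by T^2) with ordinary cross-entropy."""
    soft_student = F.log_softmax(student_logits / T, dim=-1)   # student log-probs at temperature T
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)       # teacher probs at temperature T
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)                # hard-label supervision
    return alpha * kd + (1.0 - alpha) * ce

# Example usage with random tensors (batch of 4, 10 classes):
student_logits = torch.randn(4, 10)
teacher_logits = torch.randn(4, 10)
labels = torch.randint(0, 10, (4,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```

Training the smaller student against the teacher's soft targets is one way such surveys frame reducing inference-time compute and energy relative to serving the full teacher model.
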

Authors (5)
  1. Jingjing Xu (80 papers)
  2. Wangchunshu Zhou (73 papers)
  3. Zhiyi Fu (10 papers)
  4. Hao Zhou (351 papers)
  5. Lei Li (1293 papers)
Citations (73)