Full-Cycle Energy Consumption Benchmark for Low-Carbon Computer Vision (2108.13465v2)

Published 30 Aug 2021 in cs.CV and cs.LG

Abstract: The energy consumption of deep learning models is increasing at a breathtaking rate, which raises concerns due to potential negative effects on carbon neutrality in the context of global warming and climate change. With the progress of efficient deep learning techniques, e.g., model compression, researchers can obtain efficient models with fewer parameters and lower latency. However, most existing efficient deep learning methods do not explicitly treat energy consumption as a key performance indicator. Furthermore, existing methods mostly focus on the inference costs of the resulting efficient models, neglecting the notable energy consumption incurred throughout the entire life cycle of the algorithm. In this paper, we present the first large-scale energy consumption benchmark for efficient computer vision models, where a new metric is proposed to explicitly evaluate the full-cycle energy consumption under different model usage intensities. The benchmark can provide insights for low-carbon model selection when choosing efficient deep learning algorithms in different usage scenarios.
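The core idea of a full-cycle metric can be sketched as follows. This is an illustrative simplification, not the paper's exact formula: lifetime energy is modeled as a one-off training/compression cost plus a per-query inference cost scaled by usage intensity, and all numbers below are hypothetical.

```python
# Hedged sketch of a full-cycle energy metric: total lifetime energy =
# one-off training-stage cost + per-query inference cost * usage intensity
# (number of queries served). Values are made up for illustration.

def full_cycle_energy(train_kwh: float, infer_kwh_per_query: float,
                      n_queries: int) -> float:
    """Full-cycle energy (kWh) over the model's deployment lifetime."""
    return train_kwh + infer_kwh_per_query * n_queries

# A compressed model is cheaper per query, but obtaining it (e.g. via
# search or distillation) adds extra training-stage energy.
uncompressed = dict(train_kwh=100.0, infer_kwh_per_query=2e-3)
compressed = dict(train_kwh=250.0, infer_kwh_per_query=1e-3)

# At low usage intensity the uncompressed model wins; at high intensity
# the compressed model's cheaper inference amortizes its training cost.
for n in (10_000, 1_000_000):
    e_unc = full_cycle_energy(n_queries=n, **uncompressed)
    e_cmp = full_cycle_energy(n_queries=n, **compressed)
    winner = "uncompressed" if e_unc < e_cmp else "compressed"
    print(f"{n:>9} queries -> {e_unc:8.1f} vs {e_cmp:8.1f} kWh, "
          f"winner: {winner}")
```

The crossover illustrates why usage intensity must appear in the metric: ranking models by inference cost alone hides the training-stage energy, while ranking by training cost alone penalizes heavily-used efficient models.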

Authors (9)
  1. Bo Li (1107 papers)
  2. Xinyang Jiang (40 papers)
  3. Donglin Bai (4 papers)
  4. Yuge Zhang (12 papers)
  5. Ningxin Zheng (15 papers)
  6. Xuanyi Dong (28 papers)
  7. Lu Liu (464 papers)
  8. Yuqing Yang (83 papers)
  9. Dongsheng Li (240 papers)
Citations (8)