Quantifying the Carbon Emissions of Machine Learning (1910.09700v2)

Published 21 Oct 2019 in cs.CY and cs.LG

Abstract: From an environmental standpoint, there are a few crucial aspects of training a neural network that have a major impact on the quantity of carbon that it emits. These factors include: the location of the server used for training and the energy grid that it uses, the length of the training procedure, and even the make and model of hardware on which the training takes place. In order to approximate these emissions, we present our Machine Learning Emissions Calculator, a tool for our community to better understand the environmental impact of training ML models. We accompany this tool with an explanation of the factors cited above, as well as concrete actions that individual practitioners and organizations can take to mitigate their carbon emissions.

Quantifying the Carbon Emissions of Machine Learning

The computing community has increasingly recognized the environmental impact of ML processes, particularly the carbon emitted while training neural networks. This paper quantifies the carbon emissions associated with training ML models and introduces the Machine Learning Emissions Calculator, a tool for estimating them. The authors focus on three primary factors influencing emissions: the energy source of the training servers, the duration of the training procedure, and the computational efficiency of the hardware used.

Key Factors Affecting Carbon Emissions

Energy Source Variability

The energy source at a given server's location significantly influences emission levels. The authors compiled emissions data for servers from major cloud providers, including Google Cloud Platform, Microsoft Azure, and Amazon Web Services, cross-referencing these with local energy grid data. The paper highlights stark differences in carbon intensity based on location, such as 20 g CO₂eq/kWh in Quebec, Canada, versus 736.6 g CO₂eq/kWh in Iowa, USA. These results underscore the critical impact of server location on carbon emissions.
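
To make the comparison concrete, here is a minimal sketch (in Python, which the paper itself does not use) that scales a fixed, hypothetical training job by the two grid intensities cited above:

```python
# Minimal sketch: the same hypothetical 100 kWh training job run on two
# grids. Intensity figures (g CO2eq/kWh) are the paper's examples for
# Quebec, Canada and Iowa, USA; the job size is invented for illustration.
CARBON_INTENSITY_G_PER_KWH = {
    "quebec_ca": 20.0,    # hydro-dominated grid
    "iowa_us": 736.6,     # fossil-heavy grid
}

def emissions_kg(energy_kwh: float, region: str) -> float:
    """CO2-equivalent emissions in kilograms for a job drawing energy_kwh."""
    return energy_kwh * CARBON_INTENSITY_G_PER_KWH[region] / 1000.0

job_kwh = 100.0
for region in CARBON_INTENSITY_G_PER_KWH:
    print(f"{region}: {emissions_kg(job_kwh, region):.1f} kg CO2eq")
# quebec_ca: 2.0 kg CO2eq; iowa_us: 73.7 kg CO2eq -- roughly a 37x gap.
```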

Hardware and Training Duration

The paper also evaluates the role of computing infrastructure, noting that GPU performance has grown from roughly 100 GFLOPS in 2004 to as much as 15 TFLOPS in modern hardware. Despite these gains, increasingly complex neural networks demand long, resource-intensive training runs across multiple GPUs, amplifying energy consumption. The authors propose alternatives such as fine-tuning pre-trained models and using random hyperparameter search to shorten training time and conserve energy.
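
As a rough illustration of how GPU count and run length compound, the sketch below simply multiplies power draw by duration. The 250 W default is NVIDIA's published TDP for a V100 (PCIe); the job sizes are invented for illustration:

```python
# Rough sketch of how GPU count and run length compound energy use.
# 250 W is NVIDIA's published TDP for a V100 (PCIe); job sizes are
# invented. Real draw also includes CPUs, memory, and cooling overhead.
def training_energy_kwh(num_gpus: int, hours: float,
                        gpu_watts: float = 250.0) -> float:
    """Approximate GPU-only energy draw of a training run, in kWh."""
    return num_gpus * gpu_watts * hours / 1000.0

print(training_energy_kwh(1, 6))     # short single-GPU fine-tune: 1.5 kWh
print(training_energy_kwh(8, 240))   # ten-day run on 8 GPUs: 480.0 kWh
```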

The Machine Learning Emissions Calculator

To address these concerns, the authors introduce the Machine Learning Emissions Calculator, a practical tool allowing researchers to estimate the carbon footprint of their training processes. By inputting details such as server location, GPU type, and training duration, practitioners can gain awareness of their environmental impact and adopt strategies to mitigate it.
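
The paper describes a web tool rather than code, but its inputs map naturally onto a simple power × time × grid-intensity model. The sketch below assumes that model and is not the tool's actual implementation; the GPU wattages are vendor-published TDPs, and the intensity values reuse the paper's two example figures:

```python
# Sketch of the calculator's input -> estimate flow, assuming a simple
# (power x time x grid intensity) model; the paper's web tool may differ.
# GPU wattages are vendor-published TDPs; grid intensities reuse the
# paper's example figures.
GPU_WATTS = {"V100": 250.0, "T4": 70.0}
GRID_G_CO2EQ_PER_KWH = {"quebec_ca": 20.0, "iowa_us": 736.6}

def estimate_co2eq_kg(gpu: str, hours: float, region: str) -> float:
    """Estimated emissions for `hours` of training on one `gpu` in `region`."""
    energy_kwh = GPU_WATTS[gpu] * hours / 1000.0
    return energy_kwh * GRID_G_CO2EQ_PER_KWH[region] / 1000.0

# 100 hours on a single V100:
print(estimate_co2eq_kg("V100", 100, "iowa_us"))    # ~18.4 kg CO2eq
print(estimate_co2eq_kg("V100", 100, "quebec_ca"))  # ~0.5 kg CO2eq
```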

Recommendations for Reducing Emissions

The paper outlines several actionable measures for mitigating carbon emissions in ML research:

  1. Select Environmentally Friendly Cloud Providers: Opt for providers with robust sustainability measures, such as those purchasing Renewable Energy Certificates (RECs) to offset carbon emissions.
  2. Use Energy-Efficient Data Centers: Choosing data centers in regions with low-carbon energy sources can significantly reduce emissions.
  3. Adopt Efficient Training Approaches: Prefer random search over grid search for hyperparameter tuning (see the sketch after this list), and conduct thorough literature reviews to avoid redundant experiments and wasted compute.
  4. Utilize Efficient Hardware: Hardware selection can influence emissions, with TPUs providing higher GFLOPS/W compared to traditional GPUs, leading to energy savings.
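
To illustrate recommendation 3, the sketch below contrasts the trial count of exhaustive grid search with a fixed-budget random search. The hyperparameter names and ranges are illustrative, not taken from the paper:

```python
# Sketch contrasting grid search (every combination trained) with a
# fixed-budget random search. Hyperparameter names and ranges are
# illustrative, not taken from the paper.
import itertools
import random

learning_rates = [1e-4, 1e-3, 1e-2]
batch_sizes = [32, 64, 128]
dropouts = [0.1, 0.3, 0.5]

# Grid search trains one model per combination: 3 * 3 * 3 = 27 runs.
grid_trials = list(itertools.product(learning_rates, batch_sizes, dropouts))
print(len(grid_trials))  # 27

# Random search caps the budget and can sample continuous ranges directly.
random.seed(0)
budget = 8  # 8 training runs instead of 27
random_trials = [
    (10 ** random.uniform(-4, -2),   # log-uniform learning rate
     random.choice(batch_sizes),
     random.uniform(0.1, 0.5))       # dropout sampled continuously
    for _ in range(budget)
]
for lr, bs, p in random_trials:
    print(f"lr={lr:.1e} batch={bs} dropout={p:.2f}")
```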

Implications and Future Directions

This research prompts the ML community to integrate sustainability into their evaluation criteria, encouraging practices that balance scientific progress with environmental stewardship. Future efforts could extend beyond training to include model deployment emissions, recognizing the need for comprehensive lifecycle analyses of AI systems.

By quantifying and openly discussing carbon emissions, the authors contribute to an essential discourse on making AI more sustainable. Their emissions calculator represents a step toward actionable change, fostering an informed community equipped to make environmentally conscious decisions. This focus on sustainability is likely to influence future AI research agendas, stressing the importance of developing efficient computing paradigms that also consider ecological impact.

Authors (4)
  1. Alexandre Lacoste
  2. Alexandra Luccioni
  3. Victor Schmidt
  4. Thomas Dandres
Citations (614)