
Making AI Less "Thirsty": Uncovering and Addressing the Secret Water Footprint of AI Models

Published 6 Apr 2023 in cs.LG and cs.AI (arXiv:2304.03271v5)

Abstract: The growing carbon footprint of AI has been undergoing public scrutiny. Nonetheless, the equally important water (withdrawal and consumption) footprint of AI has largely remained under the radar. For example, training the GPT-3 LLM in Microsoft's state-of-the-art U.S. data centers can directly evaporate 700,000 liters of clean freshwater, but such information has been kept a secret. More critically, the global AI demand is projected to account for 4.2-6.6 billion cubic meters of water withdrawal in 2027, which is more than the total annual water withdrawal of 4-6 Denmarks, or half that of the United Kingdom. This is concerning, as freshwater scarcity has become one of the most pressing challenges. To respond to the global water challenges, AI can, and also must, take social responsibility and lead by example by addressing its own water footprint. In this paper, we provide a principled methodology to estimate the water footprint of AI, and also discuss the unique spatial-temporal diversities of AI's runtime water efficiency. Finally, we highlight the necessity of holistically addressing water footprint along with carbon footprint to enable truly sustainable AI.

Citations (71)

Summary

  • The paper quantifies AI's hidden water consumption, demonstrating that models like GPT-3 can consume millions of liters during training.
  • It introduces a methodology that distinguishes between operational water for cooling and embodied water in AI infrastructure production.
  • The study recommends increased transparency and location-based workload scheduling to balance both water and carbon footprints in AI systems.


Introduction

The paper explores the hidden environmental costs of AI models, focusing on their water consumption, which has been largely neglected compared to carbon emissions. It emphasizes that the resource-intensive nature of AI models like GPT-3 could lead to significant water consumption, with estimates of up to 700,000 liters evaporated for training in Microsoft's state-of-the-art data centers. Moreover, the global AI demand could be responsible for 4.2-6.6 billion cubic meters of water withdrawal by 2027, surpassing the water usage of entire countries.

Motivation and Context:

AI models are primarily deployed in data centers, notorious for their high electricity and water usage. An emerging concern is the substantial water footprint—comprising both withdrawal and consumption—linked to the energy production necessary to power these models. The study highlights the urgent need to align AI's development with environmental sustainability, particularly given the existing water scarcity affecting billions of people worldwide.

Methodology for Estimating Water Footprint

The paper proposes a structured methodology to assess AI models' water footprint by considering both operational and embodied water usage.

Operational Water Footprint:

  • Scope-1 (On-site Water for Cooling): Water used for direct server cooling, typically through cooling towers or outside-air (evaporative) cooling systems. Factors such as evaporation rates and local climate significantly influence this footprint.
  • Scope-2 (Off-site Water for Electricity Generation): Water consumed in generating the electricity that powers the servers. Its intensity varies by region depending on the local energy fuel mix.

Figure 1: The U.S. eGRID-level scope-2 water consumption intensity factor vs. carbon emission rate.
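The scope-1/scope-2 accounting above can be sketched as a single formula: server energy multiplied by the on-site water usage effectiveness (WUE), plus the power usage effectiveness (PUE) times the off-site water intensity of electricity. A minimal sketch, with all numeric values hypothetical:

```python
# Sketch of operational water-footprint accounting.
# WUE, PUE, and EWIF (electricity water intensity factor) vary by
# site and grid; the values used below are illustrative only.

def operational_water_liters(server_energy_kwh: float,
                             wue_onsite: float,  # L/kWh, scope-1 cooling water
                             pue: float,         # power usage effectiveness
                             ewif: float) -> float:  # L/kWh, scope-2 water
    """Total operational water = on-site cooling + off-site generation water."""
    scope1 = server_energy_kwh * wue_onsite      # evaporated by cooling
    scope2 = server_energy_kwh * pue * ewif      # embedded in the electricity
    return scope1 + scope2

# Hypothetical example: 1,000 kWh of server energy
print(operational_water_liters(1000, wue_onsite=0.55, pue=1.2, ewif=3.1))
# scope1 = 550 L, scope2 = 3720 L -> about 4270 L total
```

The split matters because the two scopes respond to different levers: scope-1 depends on cooling technology and weather, scope-2 on the regional electricity mix.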

Embodied Water Footprint:

Refers to the water consumed in manufacturing AI infrastructure, such as semiconductor fabrication. This footprint is amortized over the lifespan of the servers, with a share attributed to each workload.
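The amortization described above can be sketched as simple pro-rating of manufacturing water over server lifetime; all figures below are hypothetical:

```python
# Amortize embodied (manufacturing) water over server lifetime,
# attributing a share to one workload. All numbers are invented.

def embodied_water_share(total_embodied_liters: float,
                         run_hours: float,
                         lifespan_hours: float) -> float:
    """Water charged to a workload, pro-rated by its share of server life."""
    return total_embodied_liters * (run_hours / lifespan_hours)

# e.g., a 30-day run on servers with a 4-year service life
print(embodied_water_share(100_000, run_hours=30 * 24,
                           lifespan_hours=4 * 365 * 24))
# -> roughly 2,055 L attributed to this run
```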

Case Study: GPT-3's Water Consumption

A case study on GPT-3 quantifies its operational water footprint. The analysis considers variations across geographic locations, reflecting differences in energy mix and cooling technologies. An estimated 5.4 million liters of water are consumed to train GPT-3, underlining the significant impact of large AI models.

  • Training Water Consumption: Varies globally due to differences in power usage effectiveness (PUE) and water usage effectiveness (WUE) across data centers.
  • Inference Water Consumption: Depends on the temporal and spatial distribution of inference tasks, demonstrating variability in resource-usage efficiency.
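Combining per-request energy with water intensities gives a rough per-response figure; the energy-per-request number below is a placeholder assumption, not a value from the paper:

```python
# Rough per-request inference water (sketch). The energy per request
# is a placeholder assumption; WUE/PUE/EWIF values are illustrative.

ENERGY_PER_REQUEST_KWH = 0.004   # hypothetical per-response server energy
WUE, PUE, EWIF = 0.55, 1.2, 3.1  # L/kWh values, illustrative

water_ml = ENERGY_PER_REQUEST_KWH * (WUE + PUE * EWIF) * 1000
print(round(water_ml, 1))  # about 17 mL per request under these assumptions
```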

Findings and Recommendations

Spatial-Temporal Diversity:

The study illustrates significant variability in water efficiency based on timing and location, highlighting opportunities to reduce water footprints by strategically scheduling AI workloads.
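Strategic scheduling can be illustrated as choosing, among candidate regions, the one with the lowest combined water intensity for a deferrable workload; the region names and values below are invented for illustration:

```python
# Location-aware scheduling sketch: pick the candidate region that
# minimizes combined (scope-1 + scope-2) water per kWh.
# All region names and intensity values are hypothetical.

regions = {
    "cool-coastal": {"wue": 0.2, "ewif": 1.8},
    "hot-inland":   {"wue": 1.9, "ewif": 3.5},
    "hydro-north":  {"wue": 0.4, "ewif": 5.0},
}

def water_per_kwh(region: dict, pue: float = 1.2) -> float:
    """Combined on-site and electricity-embedded water intensity."""
    return region["wue"] + pue * region["ewif"]

best = min(regions, key=lambda name: water_per_kwh(regions[name]))
print(best)  # region with the lowest combined water intensity
```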

Transparency and Reporting:

The paper advocates for improved transparency in AI models' water usage, proposing that water data be included in AI model reports, similar to carbon data.

Balancing Water and Carbon Footprints:

There is often a mismatch between strategies to minimize water and carbon footprints due to different peak efficiency times. The authors stress the need for holistic approaches that consider both environmental impacts to achieve sustainable AI development.
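One way to sketch such a holistic approach is a weighted score over candidate time slots, normalized so the two footprints are comparable; all intensity numbers are hypothetical:

```python
# Joint water-carbon scheduling sketch: because water-efficient and
# carbon-efficient hours rarely coincide, score each candidate slot
# with a weighted, normalized sum. All numbers are hypothetical.

slots = [  # (hour, water L/kWh, carbon gCO2/kWh)
    (2,  2.0, 450),   # night: cool air (low water) but coal-heavy grid
    (13, 4.5, 250),   # midday: solar-rich grid but evaporative cooling peaks
    (20, 3.0, 300),   # evening: moderate on both axes
]

def score(water: float, carbon: float,
          w_water: float = 0.5, w_carbon: float = 0.5) -> float:
    # Normalize each footprint by its worst slot so they are comparable.
    return w_water * water / 4.5 + w_carbon * carbon / 450

best_hour = min(slots, key=lambda s: score(s[1], s[2]))[0]
print(best_hour)  # hour 20 wins: neither the water-best nor carbon-best slot
```

The point of the example is the outcome: the jointly best slot is neither the water-optimal hour (2) nor the carbon-optimal hour (13), mirroring the mismatch the authors describe.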

Conclusion

The paper concludes by emphasizing the untapped potential to optimize AI models for water efficiency, alongside carbon efficiency. It calls for increased awareness and accountability in AI development concerning water usage, ensuring that AI models contribute positively to environmental sustainability.

The study sets a precedent for future research into AI's environmental impacts, advocating for integrated strategies that address both carbon and water footprints, positioning AI as a leader in socially responsible and environmentally sustainable technology.
