The Need for Ethical, Responsible, and Trustworthy Artificial Intelligence for Environmental Sciences (2112.08453v1)

Published 15 Dec 2021 in cs.CY, cs.AI, and cs.LG

Abstract: Given the growing use of AI and ML methods across all aspects of environmental sciences, it is imperative that we initiate a discussion about the ethical and responsible use of AI. In fact, much can be learned from other domains where AI was introduced, often with the best of intentions, yet nonetheless led to unintended societal consequences, such as hard-coding racial bias into the criminal justice system or increasing economic inequality through the financial system. A common misconception is that the environmental sciences are immune to such unintended consequences when AI is being used, as most data come from observations, and AI algorithms are based on mathematical formulas, which are often seen as objective. In this article, we argue the opposite can be the case. Using specific examples, we demonstrate many ways in which the use of AI can introduce similar consequences in the environmental sciences. This article is intended to stimulate discussion and research efforts in this direction. As a community, we should avoid repeating any foreseeable mistakes made in other domains through the introduction of AI. In fact, with proper precautions, AI can be a great tool to help *reduce* climate and environmental injustice. We primarily focus on weather and climate examples, but the conclusions apply broadly across the environmental sciences.

Authors (4)
  1. Amy McGovern (11 papers)
  2. Imme Ebert-Uphoff (20 papers)
  3. David John Gagne II (11 papers)
  4. Ann Bostrom (2 papers)
Citations (50)
