
Climate-Invariant Machine Learning (2112.08440v5)

Published 14 Dec 2021 in cs.LG, physics.ao-ph, and physics.comp-ph

Abstract: Projecting climate change is a generalization problem: we extrapolate the recent past using physical models across past, present, and future climates. Current climate models require representations of processes that occur at scales smaller than model grid size, which have been the main source of model projection uncertainty. Recent ML algorithms hold promise to improve such process representations, but tend to extrapolate poorly to climate regimes they were not trained on. To get the best of the physical and statistical worlds, we propose a new framework - termed "climate-invariant" ML - incorporating knowledge of climate processes into ML algorithms, and show that it can maintain high offline accuracy across a wide range of climate conditions and configurations in three distinct atmospheric models. Our results suggest that explicitly incorporating physical knowledge into data-driven models of Earth system processes can improve their consistency, data efficiency, and generalizability across climate regimes.
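The core idea of the "climate-invariant" framing is to transform raw model inputs into variables whose distributions shift less between climates before feeding them to an ML emulator. As a minimal illustrative sketch (not the paper's implementation), one such transformation is converting specific humidity, which grows strongly with warming, into relative humidity, which stays in roughly the same range across climates; the Tetens approximation for saturation vapor pressure used below is a standard assumption chosen for brevity:

```python
import math

def saturation_vapor_pressure(T_K: float) -> float:
    """Saturation vapor pressure over liquid water (Pa), Tetens approximation."""
    T_C = T_K - 273.15
    return 611.2 * math.exp(17.67 * T_C / (T_C + 243.5))

def relative_humidity(q: float, T_K: float, p_Pa: float) -> float:
    """Map specific humidity q (kg/kg) to relative humidity (fraction).

    Relative humidity is a more climate-invariant input than q because
    saturation vapor pressure, and hence q itself, rises sharply with
    temperature while RH remains comparatively stable across climates.
    """
    eps = 0.622  # ratio of dry-air to water-vapor gas constants, R_d / R_v
    e = q * p_Pa / (eps + (1.0 - eps) * q)  # actual vapor pressure (Pa)
    return e / saturation_vapor_pressure(T_K)

# Same moisture content, two climates: RH drops in the warmer one,
# so an emulator trained on RH sees overlapping input distributions.
rh_cool = relative_humidity(q=0.010, T_K=293.15, p_Pa=101325.0)  # ~20 C
rh_warm = relative_humidity(q=0.010, T_K=301.15, p_Pa=101325.0)  # ~28 C
print(rh_cool, rh_warm)
```

In the paper's framework, analogous physically motivated rescalings are applied to the inputs and outputs of the subgrid-process emulator so that it generalizes across the climate regimes described in the abstract.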

Authors (13)
  1. Tom Beucler (31 papers)
  2. Pierre Gentine (51 papers)
  3. Janni Yuval (10 papers)
  4. Ankitesh Gupta (2 papers)
  5. Liran Peng (6 papers)
  6. Jerry Lin (9 papers)
  7. Sungduk Yu (16 papers)
  8. Stephan Rasp (15 papers)
  9. Fiaz Ahmed (3 papers)
  10. Paul A. O'Gorman (10 papers)
  11. J. David Neelin (6 papers)
  12. Nicholas J. Lutsko (1 paper)
  13. Michael Pritchard (20 papers)
Citations (52)