Achieving Conservation of Energy in Neural Network Emulators for Climate Modeling (1906.06622v1)

Published 15 Jun 2019 in physics.ao-ph, cs.LG, and physics.comp-ph

Abstract: Artificial neural networks have the potential to emulate cloud processes with higher accuracy than the semi-empirical emulators currently used in climate models. However, neural-network models do not intrinsically conserve energy and mass, which is an obstacle to using them for long-term climate predictions. Here, we propose two methods to enforce linear conservation laws in neural-network emulators of physical models: constraining (1) the loss function or (2) the architecture of the network itself. Applied to the emulation of explicitly resolved cloud processes in a prototype multi-scale climate model, we show that architecture constraints can enforce conservation laws to satisfactory numerical precision, while all constraints help the neural network generalize better to conditions outside of its training set, such as global warming.
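The two strategies in the abstract can be sketched in a few lines. The snippet below is a hypothetical illustration, not the paper's actual code: it assumes a conservation law expressed as a linear system `C @ y = c` on the network outputs, enforces it exactly via an orthogonal projection (the architecture-constraint idea, applied as a final layer), and alternatively penalizes its violation in the loss (the loss-constraint idea). The matrix `C`, vector `c`, and weight `alpha` are made-up names for illustration.

```python
import numpy as np

def project_onto_constraint(y_raw, C, c):
    """Architecture-style hard constraint: orthogonally project raw
    network outputs onto the affine subspace {y : C @ y = c}, so the
    linear conservation law holds to numerical precision."""
    # Correction term: C^T (C C^T)^{-1} (C y_raw - c)
    residual = C @ y_raw - c
    correction = C.T @ np.linalg.solve(C @ C.T, residual)
    return y_raw - correction

def constrained_loss(y_pred, y_true, C, c, alpha=1.0):
    """Loss-style soft constraint: add a penalty for violating
    C @ y = c on top of the usual mean-squared error."""
    mse = np.mean((y_pred - y_true) ** 2)
    penalty = np.mean((C @ y_pred - c) ** 2)
    return mse + alpha * penalty
```

As the paper's results suggest, the projection enforces the law exactly by construction, while the penalty only encourages it and trades off against fit accuracy via `alpha`.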

Authors (4)
  1. Tom Beucler (31 papers)
  2. Stephan Rasp (15 papers)
  3. Michael Pritchard (20 papers)
  4. Pierre Gentine (51 papers)
Citations (80)