FourCastNet: Accelerating Global High-Resolution Weather Forecasting using Adaptive Fourier Neural Operators (2208.05419v1)

Published 8 Aug 2022 in physics.ao-ph, cs.AI, cs.CV, cs.LG, and cs.PF

Abstract: Extreme weather amplified by climate change is causing increasingly devastating impacts across the globe. The current use of physics-based numerical weather prediction (NWP) limits accuracy due to high computational cost and strict time-to-solution limits. We report that a data-driven deep learning Earth system emulator, FourCastNet, can predict global weather and generate medium-range forecasts five orders-of-magnitude faster than NWP while approaching state-of-the-art accuracy. FourCastNet is optimized and scales efficiently on three supercomputing systems: Selene, Perlmutter, and JUWELS Booster, up to 3,808 NVIDIA A100 GPUs, attaining 140.8 petaFLOPS in mixed precision (11.9% of peak at that scale). The time-to-solution for training FourCastNet measured on JUWELS Booster on 3,072 GPUs is 67.4 minutes, resulting in an 80,000-times faster time-to-solution relative to state-of-the-art NWP, in inference. FourCastNet produces accurate instantaneous weather predictions for a week in advance, enables enormous ensembles that better capture weather extremes, and supports higher global forecast resolutions.

Authors (9)
  1. Thorsten Kurth (43 papers)
  2. Shashank Subramanian (23 papers)
  3. Peter Harrington (22 papers)
  4. Jaideep Pathak (20 papers)
  5. Morteza Mardani (42 papers)
  6. David Hall (35 papers)
  7. Andrea Miele (4 papers)
  8. Karthik Kashinath (23 papers)
  9. Animashree Anandkumar (81 papers)
Citations (131)

Summary

  • The paper introduces FourCastNet, which uses adaptive Fourier neural operators to produce global medium-range weather forecasts up to 80,000 times faster than traditional numerical methods.
  • It leverages a transformer-based architecture and extensive ERA5 reanalysis data to achieve an eight-fold improvement in resolution over prior deep-learning forecast models while maintaining accuracy.
  • The model's scalability on high-performance computing platforms promises transformative impacts on climate modeling, digital twin applications, and extreme weather prediction.

FourCastNet: Accelerating Global High-Resolution Weather Forecasting using Adaptive Fourier Neural Operators

The paper explores the capabilities of FourCastNet, a deep learning-based Earth system emulator that advances global weather forecasting by employing Adaptive Fourier Neural Operators (AFNOs). The authors address the limitations inherent in physics-based numerical weather prediction (NWP) models, which are constrained by high computational costs and strict time-to-solution requirements, and propose a data-driven deep learning alternative. FourCastNet generates medium-range global forecasts five orders of magnitude faster than traditional NWP methods while approaching state-of-the-art accuracy.

FourCastNet demonstrates scalability across several high-performance computing platforms, including Selene, Perlmutter, and JUWELS Booster, utilizing up to 3,808 NVIDIA A100 GPUs and achieving 140.8 petaFLOPS in mixed precision (11.9% of peak at that scale). Training completes in 67.4 minutes on 3,072 GPUs of JUWELS Booster, and at inference time the model delivers forecasts roughly 80,000 times faster than state-of-the-art NWP, underscoring its remarkable efficiency.
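
As a rough sanity check on the quoted efficiency, assume the peak in question is the A100's dense FP16/BF16 Tensor Core peak of about 312 teraFLOPS per GPU (an assumption; the paper's exact accounting is not reproduced here). The reported 140.8 petaFLOPS then works out to roughly 11.9% of the aggregate peak across 3,808 GPUs:

    # Back-of-the-envelope check; the per-GPU peak is an assumed value.
    num_gpus = 3808
    peak_per_gpu_tflops = 312.0      # assumed A100 dense FP16/BF16 Tensor Core peak
    sustained_pflops = 140.8         # mixed-precision throughput reported in the paper

    aggregate_peak_pflops = num_gpus * peak_per_gpu_tflops / 1000.0   # ~1,188 petaFLOPS
    print(f"{sustained_pflops / aggregate_peak_pflops:.1%}")          # ~11.9%, matching the paper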

The paper outlines the critical challenges faced by traditional NWP models: the complexity, resolution, and dimensionality of atmospheric interactions, together with the need for sufficient ensemble sizes, scenario diversity, throughput, scalability, and interactivity. The introduction of the AFNO, a transformer architecture suited to continuous, high-resolution inputs via Fourier neural operators, represents a significant step forward. Rather than relying on the quadratic-cost self-attention of standard vision transformers, the AFNO performs token mixing in the Fourier domain, which allows long-range dependencies and non-local features to be modeled efficiently at full grid resolution.
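
To make the idea concrete, the sketch below (a simplified, PyTorch-style illustration, not the authors' implementation; class and parameter names are illustrative, and AFNO's block-diagonal weights and sparsity-promoting soft-thresholding are omitted) shows the essence of Fourier-domain token mixing: transform the token grid with an FFT, apply a small learned complex-valued MLP to the modes, and transform back.

    import torch
    import torch.nn as nn

    class FourierTokenMixer(nn.Module):
        """Simplified AFNO-style token mixer: global mixing is performed by a
        learned complex-valued MLP applied in the 2D Fourier domain rather than
        by quadratic-cost self-attention."""
        def __init__(self, dim: int, hidden: int = 256):
            super().__init__()
            # Complex weights stored as stacked (real, imag) parts.
            self.w1 = nn.Parameter(0.02 * torch.randn(2, dim, hidden))
            self.w2 = nn.Parameter(0.02 * torch.randn(2, hidden, dim))

        def forward(self, x):                                    # x: (B, H, W, C) token grid
            B, H, W, C = x.shape
            x_ft = torch.fft.rfft2(x, dim=(1, 2), norm="ortho")  # complex (B, H, W//2+1, C)
            w1 = torch.complex(self.w1[0], self.w1[1])
            w2 = torch.complex(self.w2[0], self.w2[1])
            h = torch.einsum("bhwc,cd->bhwd", x_ft, w1)          # per-mode channel MLP
            h = torch.complex(torch.relu(h.real), torch.relu(h.imag))
            x_ft = torch.einsum("bhwd,dc->bhwc", h, w2)
            return torch.fft.irfft2(x_ft, s=(H, W), dim=(1, 2), norm="ortho")

Because the FFT scales as O(N log N) in the number of tokens, this style of mixing remains tractable on the full 0.25-degree global grid, where dense self-attention over every token pair would be prohibitive.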

Training and inference with FourCastNet benefit from an extensive dataset, the ERA5 reanalysis from ECMWF, which provides decades of high-resolution global atmospheric fields. FourCastNet outpaces existing DL-based global weather surrogates, offering an eight-fold improvement in resolution and strong accuracy in capturing detailed phenomena such as atmospheric rivers and tropical cyclones.
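
At inference time, FourCastNet produces a medium-range forecast by stepping its own output forward autoregressively in six-hour increments. A minimal sketch of such a rollout loop, assuming a trained model that maps the atmospheric state at time t to the state at t+6h (the function name and tensor layout here are illustrative), could look like this:

    import torch

    @torch.no_grad()
    def rollout(model, initial_state, steps=28):
        """Autoregressive forecast: feed each 6-hour prediction back in as the
        next input; steps=28 corresponds to a 7-day forecast."""
        state = initial_state            # (B, C, H, W): modeled variables on the 0.25-degree grid
        trajectory = []
        for _ in range(steps):
            state = model(state)         # predict the atmospheric state 6 hours ahead
            trajectory.append(state)
        return torch.stack(trajectory, dim=1)    # (B, steps, C, H, W)

Because each step is a single forward pass, large ensembles can be produced by perturbing the initial state and repeating the rollout, which is one way the enormous ensembles highlighted in the abstract become affordable.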

Practically, FourCastNet has substantial implications for weather and climate science, the design of future supercomputing systems, and societal applications. By delivering accelerated throughput and addressing the imminent data-avalanche challenge, FourCastNet marks a departure from conventional methodologies, emphasizing scalability and interactivity. In principle, the scalable model parallelism demonstrated in FourCastNet holds promise for kilometer-scale emulation, which could redefine precision in weather prediction and enable confident forecasts of extreme weather events through extensive ensembles.

Beyond its scientific implications, FourCastNet presents transformative potential for simulation methodologies and computing architectures, encouraging its integration into digital twin applications, interactivity at scale, and broader societal decision-making frameworks. The insights offered by this model showcase a paradigm shift toward data-driven forecasting strategies and encourage future advances in AI that will further bolster climate action and sustainable interventions. Future developments will likely integrate physics-informed deep learning models to overcome the extrapolation challenges posed by unprecedented futures, propelled by the rapid evolution of scientific machine learning.
