
ORBIT-2: Scaling Exascale Vision Foundation Models for Weather and Climate Downscaling (2505.04802v1)

Published 7 May 2025 in cs.LG, astro-ph.EP, cs.AI, cs.DC, and physics.ao-ph

Abstract: Sparse observations and coarse-resolution climate models limit effective regional decision-making, underscoring the need for robust downscaling. However, existing AI methods struggle with generalization across variables and geographies and are constrained by the quadratic complexity of Vision Transformer (ViT) self-attention. We introduce ORBIT-2, a scalable foundation model for global, hyper-resolution climate downscaling. ORBIT-2 incorporates two key innovations: (1) Residual Slim ViT (Reslim), a lightweight architecture with residual learning and Bayesian regularization for efficient, robust prediction; and (2) TILES, a tile-wise sequence scaling algorithm that reduces self-attention complexity from quadratic to linear, enabling long-sequence processing and massive parallelism. ORBIT-2 scales to 10 billion parameters across 32,768 GPUs, achieving up to 1.8 ExaFLOPS sustained throughput and 92-98% strong scaling efficiency. It supports downscaling to 0.9 km global resolution and processes sequences up to 4.2 billion tokens. On 7 km resolution benchmarks, ORBIT-2 achieves high accuracy with R2 scores in the range of 0.98 to 0.99 against observation data.

Authors (16)
  1. Xiao Wang (507 papers)
  2. Jong-Youl Choi (2 papers)
  3. Takuya Kurihaya (1 paper)
  4. Isaac Lyngaas (8 papers)
  5. Hong-Jun Yoon (3 papers)
  6. Ming Fan (32 papers)
  7. Nasik Muhammad Nafi (1 paper)
  8. Aristeidis Tsaris (16 papers)
  9. Ashwin M. Aji (3 papers)
  10. Maliha Hossain (5 papers)
  11. Mohamed Wahib (38 papers)
  12. Dali Wang (12 papers)
  13. Peter Thornton (2 papers)
  14. Prasanna Balaprakash (92 papers)
  15. Moetasim Ashfaq (3 papers)
  16. Dan Lu (30 papers)

Summary

ORBIT-2: Scaling Exascale Vision Foundation Models for Weather and Climate Downscaling

The paper introduces ORBIT-2, a foundation model designed for climate and weather downscaling at hyper-resolution scales. It addresses a persistent limitation of existing AI approaches, which struggle to generalize across diverse geographies and meteorological variables, and targets the precise regional climate projections needed for planning and adaptation strategies.

Architectural Innovations

The primary architectural contribution of ORBIT-2 resides in its Residual Slim Vision Transformer (Reslim), which integrates residual learning and Bayesian regularization to optimize prediction efficiency without compromising the robustness of results. By eschewing conventional input upsampling, this architecture significantly reduces sequence length and computational requirements, marking a notable advancement in model efficiency.

The Reslim design is complemented by TILES, a tile-wise sequence scaling algorithm that reduces the traditionally quadratic complexity of Vision Transformer self-attention to linear. This is achieved by segmenting inputs into overlapping tiles for localized self-attention processing, which supports scalable long-sequence handling and massive parallelism across graphics processing units (GPUs).
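The following is a minimal, hedged sketch of tile-wise local attention in the spirit of TILES; it is not the paper's code, and the tile size, overlap, and averaging scheme are assumptions chosen for illustration.

```python
# Illustrative tile-wise local attention: tokens on a 2-D grid are split into
# overlapping tiles, and full self-attention runs only inside each tile, so the
# cost is quadratic in tile size but linear in the number of tiles.
import torch
import torch.nn.functional as F

def tilewise_attention(x: torch.Tensor, tile: int = 64, overlap: int = 8) -> torch.Tensor:
    """x: (B, H, W, C) token grid. Returns a tensor of the same shape."""
    B, H, W, C = x.shape
    stride = tile - overlap
    out = torch.zeros_like(x)
    count = torch.zeros(B, H, W, 1, device=x.device)
    for top in range(0, max(H - overlap, 1), stride):
        for left in range(0, max(W - overlap, 1), stride):
            bottom, right = min(top + tile, H), min(left + tile, W)
            patch = x[:, top:bottom, left:right, :]           # (B, h, w, C)
            tokens = patch.reshape(B, -1, C)                  # (B, h*w, C)
            # Plain scaled dot-product attention within the tile only.
            attn = F.scaled_dot_product_attention(tokens, tokens, tokens)
            out[:, top:bottom, left:right, :] += attn.reshape(patch.shape)
            count[:, top:bottom, left:right, :] += 1
    return out / count  # average contributions in the overlap regions
```

Because attention never spans more than one tile, the total cost grows linearly with sequence length, and tiles can be processed independently, which is what enables the massive GPU parallelism described above.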

Numerical Results and Claims

ORBIT-2 achieves unprecedented computational scale, supporting models up to 10 billion parameters and utilizing 32,768 GPUs. It sets a benchmark in sustained throughput by achieving up to 1.8 ExaFLOPS with 92–98% strong scaling efficiency. On 7 km resolution benchmarks over the continental United States, ORBIT-2 attained impressive R^2 scores ranging from 0.98 to 0.99, indicating high accuracy against observational datasets for both precipitation and temperature metrics.
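For reference, the reported accuracy metric is the standard coefficient of determination; a minimal NumPy sketch of how such a score is computed against observations (not tied to the paper's evaluation pipeline) follows.

```python
# R^2 = 1 - SS_res / SS_tot over all grid points; variable names are illustrative.
import numpy as np

def r2_score(obs: np.ndarray, pred: np.ndarray) -> float:
    obs, pred = obs.ravel(), pred.ravel()
    ss_res = np.sum((obs - pred) ** 2)         # residual sum of squares
    ss_tot = np.sum((obs - obs.mean()) ** 2)   # total variance of observations
    return 1.0 - ss_res / ss_tot
```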

Theoretical and Practical Implications

Theoretically, ORBIT-2 bridges significant gaps in the field by facilitating more refined, hyper-resolution global climate modeling. The implications for Earth system science are substantial, offering enhanced capabilities for a range of climate and weather studies. It also empowers sectors reliant on accurate forecasts, such as agriculture, water management, and urban infrastructure planning.

Practically, these improvements in performance and accuracy could lead to more reliable early warning systems and more effective disaster risk mitigation. Furthermore, by scaling AI-driven models to exascale, ORBIT-2 lays the groundwork for future advances in scientific computing and demonstrates potential applicability in other fields requiring large-scale spatiotemporal modeling.

Speculation on Future Developments

Looking forward, the adoption of ORBIT-2 could incentivize similar design strategies focusing on reducing computational complexity while maintaining model fidelity. Its success may promote further exploration into scalable algorithms within AI, expanding prospects in areas like genomics, fluid dynamics, and astrophysics where complex data management and high accuracy are equally pivotal.

In summary, ORBIT-2 represents a significant advancement in climate modeling, providing a foundation model poised to enable high-resolution climate projections and robust generalization across geographic and meteorological variables. This paper contributes valuable insights toward scalable AI applications within computational science, marking a step forward in leveraging exascale computing capabilities for tackling complex global challenges.
