When Geoscience Meets Foundation Models: Towards General Geoscience Artificial Intelligence System (2309.06799v5)

Published 13 Sep 2023 in cs.AI and physics.geo-ph

Abstract: Artificial intelligence (AI) has significantly advanced the Earth sciences, yet its full potential for comprehensively modeling Earth's complex dynamics remains unrealized. Geoscience foundation models (GFMs) emerge as a paradigm-shifting solution, integrating extensive cross-disciplinary data to enhance the simulation and understanding of Earth system dynamics. These data-centric AI models extract insights from petabytes of structured and unstructured data, effectively addressing the complexities of Earth systems that traditional models struggle to capture. The unique strengths of GFMs include flexible task specification, diverse input-output capabilities, and multi-modal knowledge representation, enabling analyses that surpass those of individual data sources or traditional AI methods. This review not only highlights the key advantages of GFMs but also presents essential techniques for their construction, with a focus on transformers, pre-training, and adaptation strategies. We then examine recent advancements in GFMs, including large language models (LLMs), vision models, and vision-language models, with particular emphasis on potential applications in remote sensing. The review concludes with a comprehensive analysis of the challenges and future trends in GFMs, addressing five critical aspects: data integration, model complexity, uncertainty quantification, interdisciplinary collaboration, and concerns related to privacy, trust, and security. Overall, this review offers a comprehensive overview of emerging geoscientific research paradigms, emphasizing the untapped opportunities at the intersection of advanced AI techniques and geoscience. It examines major methodologies, showcases advances in large-scale models, and discusses the challenges and prospects that will shape the future landscape of GFMs.
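
The construction pipeline the review emphasizes (transformer backbones, large-scale pre-training, then task-specific adaptation) can be illustrated with a minimal sketch. The example below is not from the paper: it uses plain PyTorch to show a toy ViT-style encoder standing in for a pretrained geoscience foundation model, whose weights are frozen while a lightweight classification head is adapted to a remote-sensing task. All names (GeoBackbone, LandCoverHead, the checkpoint path, band and class counts) are hypothetical.

```python
# Minimal sketch of the "pre-train then adapt" pattern discussed in the review.
# Not the paper's method; all module names and data shapes are illustrative.

import torch
import torch.nn as nn


class GeoBackbone(nn.Module):
    """Toy ViT-style encoder standing in for a pretrained geoscience foundation model."""

    def __init__(self, in_channels=12, patch=16, dim=256, depth=4, heads=8):
        super().__init__()
        # Patchify multispectral imagery (e.g. 12-band remote-sensing tiles) into tokens.
        self.patch_embed = nn.Conv2d(in_channels, dim, kernel_size=patch, stride=patch)
        layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=4 * dim, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):                                         # x: (B, C, H, W)
        tokens = self.patch_embed(x).flatten(2).transpose(1, 2)   # (B, N, dim)
        return self.encoder(tokens).mean(dim=1)                   # pooled features (B, dim)


class LandCoverHead(nn.Module):
    """Lightweight task head trained during adaptation while the backbone stays frozen."""

    def __init__(self, dim=256, num_classes=10):
        super().__init__()
        self.fc = nn.Linear(dim, num_classes)

    def forward(self, feats):
        return self.fc(feats)


backbone = GeoBackbone()
# In practice the backbone would load pretrained weights, e.g.:
# backbone.load_state_dict(torch.load("gfm_pretrained.pt"))  # hypothetical checkpoint
for p in backbone.parameters():
    p.requires_grad = False          # freeze the foundation model; adapt only the head

head = LandCoverHead(num_classes=10)
optimizer = torch.optim.AdamW(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One toy adaptation step on random tensors standing in for labelled remote-sensing tiles.
x = torch.randn(4, 12, 128, 128)
y = torch.randint(0, 10, (4,))
logits = head(backbone(x))
loss = loss_fn(logits, y)
loss.backward()
optimizer.step()
```

Freezing the backbone is only one adaptation strategy; full fine-tuning or parameter-efficient methods (e.g. adapters or low-rank updates) follow the same pattern of reusing pretrained representations for downstream geoscience tasks.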

Authors (7)
  1. Hao Zhang (948 papers)
  2. Jin-Jian Xu (2 papers)
  3. Hong-Wei Cui (1 paper)
  4. Lin Li (329 papers)
  5. Yaowen Yang (3 papers)
  6. Chao-Sheng Tang (2 papers)
  7. Niklas Boers (40 papers)
Citations (5)
