Prithvi WxC: Foundation Model for Weather and Climate
Johannes Schmude, Sujit Roy, Will Trojak, Johannes Jakubik, Daniel Salles Civitarese, Shraddha Singh, Julian Kuehnert, Kumar Ankur, Aman Gupta, Christopher E Phillips, Romeo Kienzler, Daniela Szwarcman, Vishal Gaur, Rajat Shinde, Rohit Lal, Arlindo Da Silva, Jorge Luis Guevara Diaz, Anne Jones, Simon Pfreundschuh, Amy Lin, Aditi Sheshadri, Udaysankar Nair
2024-09-23

Summary
This paper introduces Prithvi WxC, a 2.3-billion-parameter foundation model designed to improve weather and climate predictions. Trained on a large volume of historical atmospheric data, it aims to forecast weather patterns more accurately and efficiently than traditional numerical methods.
What's the problem?
Traditional numerical weather prediction models are computationally expensive, requiring HPC systems and long run times, which makes them less practical for rapid forecasts. Meanwhile, most existing AI weather models are built for a single task and do not transfer well across different weather-related applications. This limits their usefulness for understanding complex climate phenomena.
What's the solution?
To address these issues, the researchers developed Prithvi WxC, a foundation model with 2.3 billion parameters trained on 160 atmospheric variables from the MERRA-2 reanalysis dataset. The model employs an encoder-decoder transformer architecture that captures both regional and global weather patterns, allowing it to make accurate predictions across various scenarios. It is trained with a mixed objective that combines masked reconstruction with forecasting to improve its predictive abilities. The model has been evaluated on several challenging weather-related downstream tasks and is available as an open-source resource for others to use.
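The mixed training objective described above can be sketched in a few lines. This is a minimal illustration, not the paper's actual implementation: the toy `model`, the mean-squared-error losses, the equal weighting of the two terms, and all shapes are assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def mixed_objective(model, x_t, x_next, mask_ratio=0.5):
    """Sketch of a mixed objective combining masked reconstruction
    with forecasting. `model` maps an input state to a predicted
    state; names and loss weighting are illustrative only."""
    # Masked-reconstruction term: hide a random fraction of the
    # current state and score how well it is recovered.
    mask = rng.random(x_t.shape) < mask_ratio
    x_masked = np.where(mask, 0.0, x_t)
    recon = model(x_masked)
    loss_recon = np.mean((recon[mask] - x_t[mask]) ** 2)

    # Forecasting term: predict the next state from the full input.
    pred = model(x_t)
    loss_forecast = np.mean((pred - x_next) ** 2)

    # Equal weighting here is an assumption, not the paper's choice.
    return loss_recon + loss_forecast

# Toy stand-in for the transformer: the identity map.
identity = lambda x: x
x_t = rng.standard_normal((8, 8))
x_next = x_t + 0.1 * rng.standard_normal((8, 8))
loss = mixed_objective(identity, x_t, x_next)
```

Combining both paradigms lets one pretrained model serve reconstruction-style tasks (such as downscaling) as well as forecasting-style tasks.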
Why it matters?
This research is significant because it enhances our ability to predict weather and climate changes, which is crucial for planning and responding to environmental challenges. By making this advanced model publicly available, it encourages further research and innovation in the field of climate science, ultimately helping society better prepare for extreme weather events.
Abstract
Triggered by the realization that AI emulators can rival the performance of traditional numerical weather prediction models running on HPC systems, there is now an increasing number of large AI models that address use cases such as forecasting, downscaling, or nowcasting. While the parallel developments in the AI literature focus on foundation models -- models that can be effectively tuned to address multiple, different use cases -- the developments on the weather and climate side largely focus on single-use cases with particular emphasis on mid-range forecasting. We close this gap by introducing Prithvi WxC, a 2.3 billion parameter foundation model developed using 160 variables from the Modern-Era Retrospective Analysis for Research and Applications, Version 2 (MERRA-2). Prithvi WxC employs an encoder-decoder-based architecture, incorporating concepts from various recent transformer models to effectively capture both regional and global dependencies in the input data. The model has been designed to accommodate large token counts to model weather phenomena in different topologies at fine resolutions. Furthermore, it is trained with a mixed objective that combines the paradigms of masked reconstruction with forecasting. We test the model on a set of challenging downstream tasks, namely autoregressive rollout forecasting, downscaling, gravity wave flux parameterization, and extreme events estimation. The pretrained model with 2.3 billion parameters, along with the associated fine-tuning workflows, has been publicly released as an open-source contribution via Hugging Face.
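"Autoregressive rollout forecasting", one of the downstream tasks named in the abstract, means extending the forecast horizon by feeding the model's own output back in as the next input. A minimal sketch, with a toy single-step function standing in for the trained model:

```python
import numpy as np

def rollout(step_fn, state, n_steps):
    """Autoregressive rollout: repeatedly apply a single-step
    forecast, feeding each prediction back as the next input.
    `step_fn` is a stand-in for one forecast step of a model."""
    states = [state]
    for _ in range(n_steps):
        state = step_fn(state)
        states.append(state)
    return states

# Toy single-step "model": uniform decay toward zero (illustrative).
step = lambda s: 0.9 * s
trajectory = rollout(step, np.ones((4, 4)), n_steps=3)
```

In practice small per-step errors compound over the rollout, which is why long autoregressive forecasting is a demanding test of a pretrained model.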