Omni-Weather: Unified Multimodal Foundation Model for Weather Generation and Understanding
Zhiwang Zhou, Yuandong Pu, Xuming He, Yidi Liu, Yixin Chen, Junchao Gong, Xiang Zhuang, Wanghan Xu, Qinglong Cao, Shixiang Tang, Yihao Liu, Wenlong Zhang, Lei Bai
2025-12-29
Summary
This paper introduces Omni-Weather, a new artificial intelligence model designed to both predict the weather and explain *why* the weather is happening, all in one system.
What's the problem?
Today, predicting the weather and understanding the reasons behind it are handled by separate systems. A model is typically good at either forecasting what will happen or explaining past events, but not both. As a result, we can get a prediction without understanding the underlying causes, or an explanation without a reliable forecast.
What's the solution?
The researchers created Omni-Weather, which first encodes radar data and then passes it through a shared processing system (a shared self-attention mechanism) that both generates the forecast and explains the reasoning behind it. They also built a special Chain-of-Thought dataset that teaches the model to reason through weather patterns step by step, like a chain of thought. This lets the model produce predictions that are both more accurate and easier to interpret.
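To make the architecture concrete, here is a minimal PyTorch sketch of the idea described above: a radar encoder turns a radar frame into tokens, those tokens are processed together with text tokens by a shared self-attention backbone, and two heads read off a forecast and a textual explanation. All names, layer sizes, and design details here are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class OmniWeatherSketch(nn.Module):
    """Illustrative sketch (not the paper's code): a radar encoder feeds
    a shared self-attention backbone that serves both a generation head
    (forecast) and an understanding head (text explanation)."""

    def __init__(self, d_model=64, vocab_size=100, n_heads=4):
        super().__init__()
        # Radar encoder: patchify a single-channel radar frame into tokens
        self.radar_encoder = nn.Conv2d(1, d_model, kernel_size=8, stride=8)
        self.text_embed = nn.Embedding(vocab_size, d_model)
        # Shared self-attention over the concatenated radar + text tokens
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.shared_backbone = nn.TransformerEncoder(layer, num_layers=2)
        self.forecast_head = nn.Linear(d_model, 8 * 8)   # per-token radar patch
        self.text_head = nn.Linear(d_model, vocab_size)  # per-token word logits

    def forward(self, radar, text_ids):
        # radar: (B, 1, H, W); text_ids: (B, T) integer token ids
        radar_tok = self.radar_encoder(radar).flatten(2).transpose(1, 2)  # (B, N, D)
        text_tok = self.text_embed(text_ids)                              # (B, T, D)
        # One attention stack jointly processes both modalities
        h = self.shared_backbone(torch.cat([radar_tok, text_tok], dim=1))
        n = radar_tok.shape[1]
        forecast = self.forecast_head(h[:, :n])   # generation branch
        explanation = self.text_head(h[:, n:])    # understanding branch
        return forecast, explanation

model = OmniWeatherSketch()
radar = torch.randn(2, 1, 32, 32)              # batch of 2 radar frames
text_ids = torch.randint(0, 100, (2, 5))       # 5 prompt tokens each
forecast, explanation = model(radar, text_ids)
```

Because both outputs come from the same attention stack, gradients from the explanation task flow into the representations used for forecasting and vice versa, which is one plausible mechanism for the mutual enhancement the authors report.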
Why it matters?
This is important because combining prediction and understanding can lead to better and more reliable weather forecasts. If we understand *why* a storm is forming, we can predict its path and intensity more accurately. It also shows that building AI that can both generate information and explain its reasoning is a valuable approach in complex fields like weather science.
Abstract
Weather modeling requires both accurate prediction and mechanistic interpretation, yet existing methods treat these goals in isolation, separating generation from understanding. To address this gap, we present Omni-Weather, the first multimodal foundation model that unifies weather generation and understanding within a single architecture. Omni-Weather integrates a radar encoder for weather generation tasks, followed by unified processing using a shared self-attention mechanism. Moreover, we construct a Chain-of-Thought dataset for causal reasoning in weather generation, enabling interpretable outputs and improved perceptual quality. Extensive experiments show Omni-Weather achieves state-of-the-art performance in both weather generation and understanding. Our findings further indicate that generative and understanding tasks in the weather domain can mutually enhance each other. Omni-Weather also demonstrates the feasibility and value of unifying weather generation and understanding.