U-Cast: A Surprisingly Simple Frontier Probabilistic AI Weather Forecaster
Salva Rühling Cachay ⋅ Duncan Watson-Parris ⋅ Rose Yu
Abstract
Global weather forecasting has recently been revolutionized by AI, outperforming traditional physics-based ensembles. However, these state-of-the-art (SOTA) models rely on massive computational resources and increasingly specialized architectures, creating a high barrier to entry. In this work, we demonstrate that such complexity is not a prerequisite for SOTA performance. We introduce U-Cast, a streamlined probabilistic forecaster based on a standard U-Net, trained with Monte Carlo Dropout and the Muon optimizer. By leveraging a novel curriculum—deterministic pre-training followed by probabilistic fine-tuning on the Continuous Ranked Probability Score (CRPS)—our model achieves performance on par with or exceeding GenCast (e.g., up to 15% CRPS improvement on short-range winds) while reducing training and/or inference compute by an order of magnitude compared to leading baselines. Our 1° model trains in less than 15 H200 GPU-days and generates a 60-step forecast in just 12 seconds, suggesting a "Bitter Lesson" for AI weather forecasting: scalable, general-purpose architectures can outperform complex domain-specific designs.
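The CRPS fine-tuning objective mentioned above is, for an ensemble of forecast samples (e.g., drawn via Monte Carlo Dropout), commonly estimated as the mean absolute error of the members minus half the mean pairwise spread. The sketch below is illustrative only: the function name and the use of NumPy are our own assumptions, not the paper's implementation.

```python
import numpy as np

def crps_ensemble(samples: np.ndarray, obs: float) -> float:
    """Empirical CRPS estimator for a 1-D ensemble of m samples.

    CRPS ≈ E|X - y| - 0.5 * E|X - X'|, where X, X' are independent
    draws from the forecast ensemble and y is the observation.
    This is a generic sketch, not U-Cast's actual training code.
    """
    # Accuracy term: mean distance of each member to the observation.
    skill = np.abs(samples - obs).mean()
    # Spread term: mean pairwise distance between ensemble members.
    spread = np.abs(samples[:, None] - samples[None, :]).mean()
    return float(skill - 0.5 * spread)

# Example: a two-member ensemble {0, 2} evaluated against obs = 1.
print(crps_ensemble(np.array([0.0, 2.0]), 1.0))  # → 0.5
```

A perfectly sharp, perfectly accurate ensemble (all members equal to the observation) attains the minimum score of zero, so minimizing this loss rewards both calibration and accuracy at once.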