KUMA: A Novel Framework with Koopman Separation and Efficient Multilevel Extraction in Time Series Forecasting
Abstract
Time series forecasting plays a crucial role in a wide range of real-world applications and has become increasingly complex with the growth of multivariate dimensions and extended historical observations, spurring the development of deep forecasting models. Existing models are hindered by three major challenges: high computational complexity, inefficient token utilization caused by redundancy and scarcity, and temporal distribution shifts arising from non-stationary dynamics. Inspired by Koopman theory and the success of multilevel encoder–decoder architectures with skip connections, we design an input-dependent Koopman module that decomposes a time series into Koopman dynamics and residual dynamics. Building upon this formulation, we propose a U-shaped Multilevel Attention module (UMA) that integrates element-wise attention filtering with linear attention, giving rise to KUMA. The input-dependent Koopman operator mitigates operator mixture and alleviates temporal distribution shifts, while UMA strikes a favorable balance between token redundancy and token scarcity at acceptable computational cost. Comprehensive evaluations across 12 benchmark datasets demonstrate that KUMA outperforms state-of-the-art approaches.
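To make the decomposition concrete, the following is a minimal sketch (not the authors' implementation, which uses a learned, input-dependent operator) of the general Koopman idea the abstract builds on: fit a linear operator K by least squares so that consecutive snapshots satisfy x_{t+1} ≈ K x_t, then split the series into the Koopman-predictable dynamics and a residual component. The function name `koopman_split` is illustrative.

```python
import numpy as np

def koopman_split(series: np.ndarray):
    """series: (T, d) multivariate time series.
    Returns (koopman_part, residual), each of shape (T-1, d)."""
    X, Y = series[:-1], series[1:]           # snapshot pairs (x_t, x_{t+1})
    # Least-squares Koopman operator: K = argmin_K ||X K - Y||_F
    K, *_ = np.linalg.lstsq(X, Y, rcond=None)
    koopman_part = X @ K                     # linearly predictable dynamics
    residual = Y - koopman_part              # leftover non-linear / shifting part
    return koopman_part, residual

# A near-linear system (rotation on the circle) plus small noise:
rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 200)
series = np.stack([np.sin(t), np.cos(t)], axis=1)
series += 0.01 * rng.standard_normal(series.shape)
kp, res = koopman_split(series)
print(kp.shape, res.shape)  # (199, 2) (199, 2)
```

For such a system the residual is small, since a sine/cosine pair evolves by a fixed rotation; an input-dependent operator, as in KUMA, instead adapts K to each input window to track non-stationary dynamics.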