See More, Forecast Better and Faster: Enhancing Time Series Foundation Models via Inference-Time Plug-and-Play Downsampling
Longlong Xu ⋅ Zeyan Li ⋅ Xiao He ⋅ Zhaoyang Yu ⋅ Dazhong Wen ⋅ Mingze Sun ⋅ Changhua Pei ⋅ Dan Pei
Abstract
Time series foundation models (TSFMs) have demonstrated impressive generalization capabilities across diverse domains. However, they face significant challenges in long-term and ultra-long-term forecasting. These challenges primarily arise from scalability limitations when TSFMs process extensive sequence lengths. To address this, we propose SPRINT, a training-free, plug-and-play framework designed to empower TSFMs to see more and forecast better and faster during inference. The core idea is to perform forecasting in a downsampled-resolution space, enabling an extended look-back window at reduced computational cost. To avoid the information loss and resolution mismatch caused by downsampling, SPRINT decomposes the time series into trend and seasonal components and processes them separately. It predicts the low-frequency trend via a Resolution Interpolation workflow within the downsampled space, while preserving high-frequency details through a Pattern Replication mechanism for seasonality. Extensive experiments show that SPRINT achieves significant improvements, increasing accuracy by 19% while reducing peak memory usage by 6.4× and inference time by 16.9× compared to state-of-the-art TSFMs.
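The pipeline outlined in the abstract (decompose, forecast the trend at low resolution, interpolate back, replicate the seasonal pattern) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: `base_forecast` stands in for an arbitrary TSFM forecasting call, and the moving-average decomposition, the downsampling `factor`, and the function names are all hypothetical choices made here.

```python
import numpy as np

def sprint_sketch(history, horizon, period, factor, base_forecast):
    """Illustrative sketch of the SPRINT idea (not the paper's code).

    history: 1-D array of past observations.
    horizon: number of future steps to predict.
    period: assumed length of one seasonal cycle.
    factor: downsampling factor for the trend path.
    base_forecast: stand-in for any TSFM, called as base_forecast(series, steps).
    """
    # 1. Decompose into a low-frequency trend (moving average over one
    #    seasonal period) and a high-frequency seasonal residual.
    kernel = np.ones(period) / period
    trend = np.convolve(history, kernel, mode="same")
    seasonal = history - trend

    # 2. Resolution Interpolation: forecast the trend in a downsampled
    #    space (extending the effective look-back window cheaply), then
    #    linearly interpolate the forecast back to full resolution.
    trend_ds = trend[::factor]
    steps_ds = int(np.ceil(horizon / factor))
    trend_fc_ds = base_forecast(trend_ds, steps_ds)
    x_ds = np.arange(steps_ds) * factor
    trend_fc = np.interp(np.arange(horizon), x_ds, trend_fc_ds)

    # 3. Pattern Replication: tile the most recent seasonal cycle across
    #    the horizon to keep high-frequency detail at full resolution.
    last_cycle = seasonal[-period:]
    seasonal_fc = np.tile(last_cycle, horizon // period + 1)[:horizon]

    return trend_fc + seasonal_fc

# Usage with a trivial stand-in forecaster (repeat the last value):
naive = lambda series, steps: np.full(steps, series[-1])
history = np.sin(np.arange(200) * 2 * np.pi / 20) + 0.01 * np.arange(200)
forecast = sprint_sketch(history, horizon=48, period=20, factor=4,
                         base_forecast=naive)
```

The TSFM only ever sees `trend_ds`, which is `factor` times shorter than the raw window, which is where the memory and latency savings in the abstract would come from in this sketch.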