Channel Adapter for Time Series Foundation Models in Zero-Shot Multivariate Forecasting
Abstract
Time Series Foundation Models (TSFMs) have achieved strong performance in univariate time series forecasting. However, most TSFMs rely on channel-independent pre-training that models each variable separately, limiting their ability to exploit the inter-channel information that is crucial in real-world multivariate systems. Motivated by this limitation, we propose Chada, a lightweight plug-and-play channel adapter that allows frozen TSFMs to leverage multivariate correlations in a zero-shot setting. Chada first builds a budgeted pre-training dataset that covers diverse, heterogeneous inter-channel dependency patterns. It then uses data-derived domain descriptors to learn a dataset-conditioned inter-channel similarity measure that reduces cross-domain metric distortion. Finally, it injects sparse inter-channel information via gated refinement, incorporating multivariate signals without degrading intra-channel temporal dynamics. Extensive experiments on nine benchmarks validate the effectiveness of Chada, which delivers consistent zero-shot improvements over four best-performing TSFMs while remaining lightweight enough for scalable deployment. Code is available at https://anonymous.4open.science/r/CHADA-A6.
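The gated sparse refinement described above can be illustrated with a minimal sketch. All names and the exact mixing rule here are assumptions for illustration (the paper's actual adapter is learned); the sketch only shows the general idea of a gated residual update that mixes in a few most-similar channels while leaving intra-channel dynamics dominant.

```python
import numpy as np

def gated_channel_refinement(h, sim, k=2, gate=0.1):
    """Illustrative (hypothetical) gated sparse inter-channel refinement.

    h    : (C, T) per-channel representations from a frozen TSFM
    sim  : (C, C) inter-channel similarity matrix (e.g. dataset-conditioned)
    k    : number of most-similar channels mixed into each channel (sparsity)
    gate : scalar gate in [0, 1] controlling how much cross-channel
           signal is injected as a residual
    """
    C, _ = h.shape
    refined = h.copy()
    for c in range(C):
        # select the k most similar channels, excluding c itself
        scores = sim[c].astype(float).copy()
        scores[c] = -np.inf
        top = np.argsort(scores)[-k:]
        # softmax weights over the selected neighbors
        w = np.exp(scores[top] - scores[top].max())
        w /= w.sum()
        mix = (w[:, None] * h[top]).sum(axis=0)
        # gated residual: the original intra-channel signal stays dominant
        refined[c] = h[c] + gate * mix
    return refined
```

With `gate=0` the update is the identity, so the frozen model's per-channel output passes through unchanged; the gate thus bounds how strongly multivariate information can perturb the temporal dynamics.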