TimeMRA: LLM-Empowered Time Series Forecasting via Multi-Scale Retrieval-Augmented Representations
Abstract
Time series forecasting plays a pivotal role in data-driven decision-making across diverse application domains. Recently, Large Language Models (LLMs), with their ability to extract semantically rich representations, have achieved promising results in time series forecasting. However, existing LLM-based methods struggle to obtain multi-scale retrieval-augmented representations, owing to entangled representations across scales and interference from redundant scales. To address these issues, we propose TimeMRA, an LLM-empowered Time series forecasting framework built on Multi-Scale Retrieval-Augmented representations. Specifically, a scale-aware prompt generation (SAPG) module decomposes the time series into multiple scales and generates augmented multi-scale representations. A cross-scale disentanglement constraint (CSDC) mechanism with a router network then disentangles the multi-scale semantic representations while mitigating interference from irrelevant scales. Finally, a cross-modality retrieval module produces multi-scale retrieval-augmented representations for forecasting. Experiments on 10 real-world datasets demonstrate that TimeMRA achieves state-of-the-art (SOTA) performance.
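The abstract does not specify how SAPG decomposes a series into multiple scales; a common choice in multi-scale forecasting is non-overlapping average pooling at several strides, each stride yielding a coarser view of the same series. The sketch below illustrates that generic idea only; the function name `multi_scale_decompose` and the scale set are hypothetical, not the paper's actual design.

```python
import numpy as np

def multi_scale_decompose(series, scales=(1, 2, 4)):
    """Illustrative multi-scale decomposition (not the paper's SAPG):
    downsample a 1-D series by non-overlapping average pooling at each
    scale, producing one progressively coarser view per scale."""
    views = []
    for s in scales:
        n = len(series) // s * s           # trim so the length divides evenly
        pooled = series[:n].reshape(-1, s).mean(axis=1)
        views.append(pooled)
    return views

# A length-8 series viewed at scales 1, 2, and 4.
x = np.arange(8, dtype=float)
for scale, view in zip((1, 2, 4), multi_scale_decompose(x)):
    print(scale, view.tolist())
```

Each view would then feed the downstream prompt-generation and retrieval stages; finer scales preserve local detail while coarser scales expose trend-level structure.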