Time-CoT: Hierarchical Reasoning with Temporal Semantic Codes for Multivariate Time Series Classification
Abstract
Integrating Large Language Models (LLMs) into time series tasks has yielded impressive performance. While some works aim to enhance accuracy by explicitly designing step-by-step reasoning into prompts, such explicit Chain-of-Thought (CoT) approaches are difficult to generalize to time series, because reasoning trajectories for time series are hard to define clearly. In addition, the high heterogeneity across time series often requires specialized prompt designs, limiting the model's scalability. To address these challenges, we propose Time-CoT (Time Series Chain-of-Thought), a hierarchical reasoning framework based on temporal semantic codes for multivariate time series classification. The framework automatically constructs scenario-specific reasoning trajectories from the characteristics of the time series, thereby better eliciting the LLM's reasoning capability on time-series data. Specifically, in Time-CoT, we first perform temporal representation pre-training with multi-view temporal representation fusion to acquire high-quality temporal embeddings. We then discretize these temporal embeddings into hierarchical temporal semantic codes that serve as the reasoning trajectory. Finally, the LLM predicts the temporal semantic codes in a stepwise manner and then infers the final labels, establishing a coarse-to-fine decision process. Experiments on ten public multivariate time series datasets demonstrate that Time-CoT effectively adapts to diverse datasets and outperforms state-of-the-art methods. Our code is available at .
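To make the discretization step concrete, the following is a minimal, hypothetical sketch of turning a continuous temporal embedding into a two-level (coarse-to-fine) semantic code via residual quantization. The codebooks, dimensions, and token format below are illustrative assumptions, not the paper's actual implementation; in Time-CoT the codebooks would be learned during temporal representation pre-training rather than sampled at random.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: in the actual framework these codebooks are learned
# during temporal representation pre-training, not drawn at random.
coarse_codebook = rng.normal(size=(8, 16))   # 8 coarse codes, 16-dim embeddings
fine_codebook = rng.normal(size=(32, 16))    # 32 fine codes for the residual

def hierarchical_codes(embedding: np.ndarray) -> tuple[int, int]:
    """Map one temporal embedding to a (coarse, fine) code pair
    by nearest-neighbor lookup, then residual quantization."""
    c = int(np.argmin(np.linalg.norm(coarse_codebook - embedding, axis=1)))
    residual = embedding - coarse_codebook[c]
    f = int(np.argmin(np.linalg.norm(fine_codebook - residual, axis=1)))
    return c, f

embedding = rng.normal(size=16)
coarse, fine = hierarchical_codes(embedding)

# The code pair forms the coarse-to-fine "reasoning trajectory": the LLM would
# first predict the coarse code, then the fine code, then infer the class label.
trajectory = f"<coarse_{coarse}> <fine_{fine}>"
print(trajectory)
```

The key design point this sketch illustrates is that each level of the code hierarchy refines the previous one (the fine code quantizes the residual left over after the coarse lookup), which is what gives the LLM's stepwise prediction its coarse-to-fine structure.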