Towards Foundation Models for Zero-Shot Time Series Anomaly Detection: Leveraging Synthetic Data and Relative Context Discrepancy
Abstract
Time series anomaly detection (TSAD) is a critical task, but developing models that generalize to unseen data in a zero-shot manner remains a major challenge. Prevailing foundation models for TSAD predominantly rely on reconstruction-based objectives, which suffer from a fundamental objective mismatch and representation conflict: they tend to memorize static patterns from the training data, struggling to identify subtle anomalies while often misinterpreting complex normal patterns in unseen domains as anomalous. To overcome these limitations, we introduce TimeRCD, a novel foundation model for TSAD built upon a new pre-training paradigm: Relative Context Discrepancy (RCD). Instead of reconstructing inputs against fixed priors, TimeRCD is explicitly trained to identify anomalies adaptively by contrasting each query window with its surrounding context. This relational approach, implemented with a standard Transformer architecture, enables the model to infer normality on the fly and to capture the contextual shifts indicative of anomalies that reconstruction-based methods often miss. To empower this paradigm, we develop a large-scale, diverse synthetic corpus with context-dependent anomaly labels, providing the rich supervisory signal necessary for effective pre-training. Extensive experiments across diverse datasets demonstrate that TimeRCD significantly outperforms existing general-purpose and anomaly-specific foundation models in zero-shot TSAD. Our results validate the superiority of the RCD paradigm and establish a new, effective path toward building robust and generalizable foundation models for time series anomaly detection. The code is available at \url{https://anonymous.4open.science/r/TimeRCD-5BE1/}.
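The abstract characterizes RCD only at a high level: a Transformer contrasts a query with its context and is supervised by context-dependent anomaly labels. As a minimal, hypothetical sketch of what such an objective could look like, the snippet below tokenizes a window into patches, lets a plain Transformer encoder compare each patch against the rest of the window, and trains per-patch anomaly logits with binary cross-entropy. All class names, shapes, and hyperparameters here are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

class RCDSketch(nn.Module):
    """Hypothetical sketch of an RCD-style objective: predict, for each
    patch, whether it is discrepant relative to the surrounding context."""

    def __init__(self, patch_len=16, d_model=128, n_heads=4, n_layers=3):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)  # patch -> token
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)           # per-patch anomaly logit

    def forward(self, x):
        # x: (batch, seq_len) univariate series; seq_len divisible by patch_len
        b, t = x.shape
        patches = x.view(b, t // self.patch_len, self.patch_len)
        tokens = self.embed(patches)
        # Self-attention lets each patch token compare itself against the
        # full window, so "normality" is inferred from the context itself
        # rather than memorized from the training distribution.
        ctx = self.encoder(tokens)
        return self.head(ctx).squeeze(-1)            # (batch, n_patches)

# Pre-training step on synthetic data: binary cross-entropy against
# context-dependent anomaly labels (one label per patch, assumed to come
# from the synthetic generator).
model = RCDSketch()
x = torch.randn(8, 256)                  # toy batch of synthetic series
labels = torch.zeros(8, 256 // 16)       # per-patch labels (illustrative)
loss = nn.functional.binary_cross_entropy_with_logits(model(x), labels)
```

Under this reading, the supervisory signal is relational by construction: the same patch values can be labeled normal or anomalous depending on the window around them, which is what distinguishes RCD from reconstructing inputs against a fixed prior.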