

Poster
in
Workshop: ICML 2024 Workshop on Foundation Models in the Wild

TimeDiT: General-purpose Diffusion Transformers for Time Series Foundation Model

Defu Cao · Wen Ye · Yan Liu

Keywords: [ Foundation model; Time series; Generative model; Diffusion model ]


Abstract:

Time series modeling is critical for many real-world applications, but most existing approaches are task-specific. Unique characteristics of time series data, such as missing values, irregular sampling, multiple resolutions, and complex temporal dependencies, make it challenging to develop general-purpose foundation models. In this paper, we introduce the Time Series Diffusion Transformer (TimeDiT), equipped with three distinct masking schemes designed to provide a uniform training and inference pipeline across various time series tasks. TimeDiT leverages the transformer architecture to capture temporal dependencies and employs diffusion processes to generate high-quality candidate samples without stringent assumptions on the target distribution. Extensive experiments on diverse datasets, encompassing forecasting, imputation, and anomaly detection tasks, demonstrate the model’s effectiveness. Both in-domain and zero-shot testing scenarios confirm the potential of our model to serve as a robust foundation model for multiple time series applications.
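The core mechanism the abstract describes, diffusion-based generation conditioned on observed values through a mask, can be illustrated with a minimal sketch. This is not the paper's actual implementation: the function names, the DDPM-style linear noise schedule, and the specific masking strategy (random element-wise missingness, one of many possible schemes) are all illustrative assumptions; the transformer denoiser itself is omitted.

```python
import numpy as np

def make_mask(T, C, p_missing, rng):
    """Illustrative imputation-style mask over a (T, C) series:
    1 = observed (conditioning), 0 = missing (to be generated).
    Other schemes (e.g. masking a forecast horizon) work the same way."""
    return (rng.random((T, C)) > p_missing).astype(np.float64)

def forward_diffusion(x0, t, betas, mask, rng):
    """Standard DDPM forward process q(x_t | x_0) applied only where
    mask == 0; observed entries stay fixed as conditioning signal.
    Returns the partially noised series and the noise the denoiser
    would be trained to predict at the masked positions."""
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)[t]          # cumulative product up to step t
    noise = rng.standard_normal(x0.shape)
    x_t = np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * noise
    return mask * x0 + (1.0 - mask) * x_t, noise

# Usage: a 24-step, 3-channel series with ~30% missing entries,
# noised to diffusion step t = 50 of a 100-step linear schedule.
rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 100)
x0 = rng.standard_normal((24, 3))
mask = make_mask(24, 3, 0.3, rng)
x_t, noise = forward_diffusion(x0, 50, betas, mask, rng)
```

Because observed entries pass through unchanged, the same pipeline handles forecasting, imputation, or anomaly detection simply by changing which positions the mask marks as observed, which is the sense in which masking unifies training and inference across tasks.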
