Recently, denoising diffusion models have led to significant breakthroughs in the generation of images, audio, and text. However, how to adapt their strong modeling ability to time series remains an open question. In this paper, we propose TimeDiff, a non-autoregressive diffusion model that achieves high-quality time series prediction by introducing two novel conditioning mechanisms: future mixup and autoregressive initialization. Similar to teacher forcing, future mixup mixes parts of the ground-truth future values into the conditioning signal during training, while autoregressive initialization helps better initialize the model with basic time series patterns such as short-term trends. Extensive experiments are performed on nine real-world datasets. Results show that TimeDiff consistently outperforms existing time series diffusion models, and also achieves the best overall performance across a variety of strong existing baselines (including transformers and FiLM).
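As a rough illustration of the future-mixup idea described in the abstract, the following is a minimal sketch (not the paper's implementation): during training, an elementwise random mask decides, per position, whether the conditioning signal receives a model-derived value or the corresponding ground-truth future value, analogous to teacher forcing. The function name and the fixed masking rate are hypothetical choices for this sketch.

```python
import numpy as np

def future_mixup(cond, future_gt, rng, rate=0.5):
    """Teacher-forcing-style conditioning sketch: per element, replace the
    model-derived conditioning value with the ground-truth future value
    with probability `rate`, leaving the rest unchanged."""
    m = (rng.random(cond.shape) < rate).astype(cond.dtype)  # binary mixing mask
    return m * future_gt + (1.0 - m) * cond

# Example: with all-zero conditioning and all-one ground truth, the output
# directly reveals which positions were taken from the ground-truth future.
rng = np.random.default_rng(0)
mixed = future_mixup(np.zeros((4, 8)), np.ones((4, 8)), rng)
```

At inference time no ground-truth future is available, so the mask would be disabled (or `rate` set to 0) and only the model-derived conditioning is used.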
Author Information
Lifeng Shen (Hong Kong University of Science and Technology)
James Kwok (Hong Kong University of Science and Technology)
More from the Same Authors
- 2023 Poster: Effective Structured Prompting by Meta-Learning and Representative Verbalizer
  Weisen Jiang · Yu Zhang · James Kwok
- 2023 Poster: Nonparametric Iterative Machine Teaching
  CHEN ZHANG · Xiaofeng Cao · Weiyang Liu · Ivor Tsang · James Kwok
- 2022 Poster: Subspace Learning for Effective Meta-Learning
  Weisen Jiang · James Kwok · Yu Zhang
- 2022 Spotlight: Subspace Learning for Effective Meta-Learning
  Weisen Jiang · James Kwok · Yu Zhang
- 2022 Poster: Efficient Variance Reduction for Meta-learning
  Hansi Yang · James Kwok
- 2022 Spotlight: Efficient Variance Reduction for Meta-learning
  Hansi Yang · James Kwok
- 2021 Poster: SparseBERT: Rethinking the Importance Analysis in Self-attention
  Han Shi · Jiahui Gao · Xiaozhe Ren · Hang Xu · Xiaodan Liang · Zhenguo Li · James Kwok
- 2021 Spotlight: SparseBERT: Rethinking the Importance Analysis in Self-attention
  Han Shi · Jiahui Gao · Xiaozhe Ren · Hang Xu · Xiaodan Liang · Zhenguo Li · James Kwok
- 2020 Poster: Searching to Exploit Memorization Effect in Learning with Noisy Labels
  QUANMING YAO · Hansi Yang · Bo Han · Gang Niu · James Kwok
- 2019 Poster: Efficient Nonconvex Regularized Tensor Completion with Structure-aware Proximal Iterations
  Quanming Yao · James Kwok · Bo Han
- 2019 Oral: Efficient Nonconvex Regularized Tensor Completion with Structure-aware Proximal Iterations
  Quanming Yao · James Kwok · Bo Han
- 2018 Poster: Online Convolutional Sparse Coding with Sample-Dependent Dictionary
  Yaqing WANG · Quanming Yao · James Kwok · Lionel NI
- 2018 Poster: Lightweight Stochastic Optimization for Minimizing Finite Sums with Infinite Data
  Shuai Zheng · James Kwok
- 2018 Oral: Lightweight Stochastic Optimization for Minimizing Finite Sums with Infinite Data
  Shuai Zheng · James Kwok
- 2018 Oral: Online Convolutional Sparse Coding with Sample-Dependent Dictionary
  Yaqing WANG · Quanming Yao · James Kwok · Lionel NI
- 2017 Poster: Follow the Moving Leader in Deep Learning
  Shuai Zheng · James Kwok
- 2017 Talk: Follow the Moving Leader in Deep Learning
  Shuai Zheng · James Kwok