

Poster

FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting

Tian Zhou · Ziqing Ma · Qingsong Wen · Xue Wang · Liang Sun · Rong Jin

Hall E #502

Keywords: [ DL: Attention Mechanisms ] [ APP: Time Series ] [ DL: Self-Supervised Learning ] [ DL: Theory ] [ DL: Algorithms ] [ DL: Sequential Models, Time series ]


Abstract:

Long-term time series forecasting is challenging since prediction accuracy tends to decrease dramatically with an increasing horizon. Although Transformer-based methods have significantly improved state-of-the-art results for long-term forecasting, they are not only computationally expensive but, more importantly, unable to capture the global view of the time series (e.g., the overall trend). To address these problems, we propose to combine the Transformer with a seasonal-trend decomposition method, in which the decomposition captures the global profile of the time series while the Transformer captures more detailed structures. To further enhance the Transformer for long-term prediction, we exploit the fact that most time series tend to have a sparse representation in a well-known basis such as the Fourier transform, and develop a frequency enhanced Transformer. Besides being more effective, the proposed method, termed Frequency Enhanced Decomposed Transformer (FEDformer), is more efficient than the standard Transformer, with complexity linear in the sequence length. Our empirical studies on six benchmark datasets show that, compared with state-of-the-art methods, FEDformer reduces prediction error by 14.8% and 22.6% for multivariate and univariate time series, respectively. Code is publicly available at https://github.com/MAZiqing/FEDformer.
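The abstract's two key ideas, seasonal-trend decomposition and a frequency enhanced block that keeps only a sparse subset of Fourier modes (giving linear cost in sequence length), can be illustrated with a minimal PyTorch sketch. This is a hypothetical reconstruction for intuition, not the authors' implementation (that lives at the repository linked above); the names `SeriesDecomposition`, `FrequencyEnhancedBlock`, `kernel_size`, and `n_modes` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SeriesDecomposition(nn.Module):
    """Split a series into seasonal and trend parts via a moving average."""
    def __init__(self, kernel_size: int):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size, stride=1, padding=0)

    def forward(self, x: torch.Tensor):
        # x: (batch, length, channels)
        # Replicate the endpoints so the moving average preserves length.
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        back = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        padded = torch.cat([front, x, back], dim=1)
        trend = self.avg(padded.transpose(1, 2)).transpose(1, 2)
        seasonal = x - trend
        return seasonal, trend

class FrequencyEnhancedBlock(nn.Module):
    """Keep a fixed random subset of Fourier modes and mix them with
    learnable complex weights; the rest are zeroed, so the per-step cost
    stays linear in the sequence length."""
    def __init__(self, seq_len: int, channels: int, n_modes: int = 16):
        super().__init__()
        n_freqs = seq_len // 2 + 1  # length of the rFFT output
        idx = torch.randperm(n_freqs)[:min(n_modes, n_freqs)]
        self.register_buffer("modes", idx)
        scale = 1.0 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(len(idx), channels, dtype=torch.cfloat))

    def forward(self, x: torch.Tensor):
        # x: (batch, length, channels)
        spec = torch.fft.rfft(x, dim=1)                  # to frequency domain
        out = torch.zeros_like(spec)
        out[:, self.modes, :] = spec[:, self.modes, :] * self.weight
        return torch.fft.irfft(out, n=x.size(1), dim=1)  # back to time domain

if __name__ == "__main__":
    x = torch.randn(8, 96, 7)  # (batch, length, channels)
    seasonal, trend = SeriesDecomposition(kernel_size=25)(x)
    y = FrequencyEnhancedBlock(seq_len=96, channels=7)(seasonal)
    print(y.shape)  # torch.Size([8, 96, 7])
```

The random mode selection reflects the sparsity assumption stated in the abstract: if the series is well represented by a small number of Fourier components, retaining only `n_modes` of them preserves the signal while avoiding the quadratic cost of full attention.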
