

Poster in Workshop: Next Generation of Sequence Modeling Architectures

FutureTST: When Transformers Meet Future Exogenous Drivers

Kshitij Tayal · Arvind Renganathan · Vipin Kumar · Dan Lu


Abstract:

Previous forecasting approaches often overlook covariate, or "exogenous", information that is crucial for predicting the target, or "endogenous", variable. Recent transformer-based models tackle this by treating forecasting as a multivariate problem, forecasting both covariate and target variables as joint functions of their pasts and treating all variables equally. However, these methods typically fail to incorporate future exogenous information, which is often available and important for accurate forecasting. To address this limitation, we introduce FutureTST, a novel transformer framework designed to leverage future exogenous inputs for improved forecasting of endogenous variables in real-world systems. Our framework employs patch-wise self-attention to discern temporal patterns in the endogenous variable and variate-wise cross-attention to integrate the influence of exogenous data, thereby enhancing the model's ability to assimilate and utilize external information. This dual attention mechanism enables FutureTST to adapt dynamically to both target and covariate series, significantly enhancing forecast accuracy. Extensive experiments across multiple real-world datasets demonstrate that FutureTST outperforms a wide range of existing forecasting approaches by 10%, highlighting its effectiveness in using external information for improved time-series predictions.
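The page does not include code, so the following is a minimal PyTorch sketch of the dual attention idea the abstract describes: patch-wise self-attention over the endogenous series, followed by variate-wise cross-attention in which the target's patch tokens attend to exogenous variates whose embeddings include known future values. All module names, dimensions, and the patching scheme are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the dual-attention mechanism described in the
# abstract. Names, dimensions, and the patching scheme are assumptions,
# not the authors' released code.
import torch
import torch.nn as nn


class FutureTSTSketch(nn.Module):
    def __init__(self, lookback=96, horizon=24, patch_len=16, stride=8,
                 n_exog=4, d_model=64, n_heads=4):
        super().__init__()
        self.patch_len, self.stride = patch_len, stride
        n_patches = (lookback - patch_len) // stride + 1
        # Tokenize the endogenous history: each patch becomes one token.
        self.patch_embed = nn.Linear(patch_len, d_model)
        self.pos_embed = nn.Parameter(torch.zeros(1, n_patches, d_model))
        # Patch-wise self-attention: temporal patterns in the target series.
        self.self_attn = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        # Each exogenous variate (past AND known future values concatenated
        # along time) is embedded as a single variate token.
        self.exog_embed = nn.Linear(lookback + horizon, d_model)
        # Variate-wise cross-attention: target patch tokens (queries) attend
        # to exogenous variate tokens (keys/values).
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads,
                                                batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        # Flatten patch tokens and project to the forecast horizon.
        self.head = nn.Linear(n_patches * d_model, horizon)

    def forward(self, y_past, x_past, x_future):
        # y_past:   (B, lookback)          endogenous history
        # x_past:   (B, n_exog, lookback)  exogenous history
        # x_future: (B, n_exog, horizon)   known future exogenous drivers
        patches = y_past.unfold(-1, self.patch_len, self.stride)
        tokens = self.patch_embed(patches) + self.pos_embed  # (B, P, d_model)
        tokens = self.self_attn(tokens)
        exog = self.exog_embed(torch.cat([x_past, x_future], dim=-1))
        attended, _ = self.cross_attn(tokens, exog, exog)
        tokens = self.norm(tokens + attended)
        return self.head(tokens.flatten(1))                  # (B, horizon)


model = FutureTSTSketch()
y_hat = model(torch.randn(2, 96), torch.randn(2, 4, 96), torch.randn(2, 4, 24))
print(y_hat.shape)  # torch.Size([2, 24])
```

One design point worth noting: embedding each exogenous series as a single variate token (rather than patching it like the target) is what makes the cross-attention "variate-wise", and it lets the model weight whole covariate series by relevance while keeping the future exogenous values in the same token as the past.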
