

Spotlight

End-to-End Learning of Coherent Probabilistic Forecasts for Hierarchical Time Series

Syama Sundar Yadav Rangapuram · Lucien Werner · Konstantinos Benidis · Pedro Mercado · Jan Gasthaus · Tim Januschowski

Abstract:

This paper presents a novel approach for hierarchical time series forecasting that produces coherent, probabilistic forecasts without requiring any explicit post-processing reconciliation. Unlike the state-of-the-art, the proposed method simultaneously learns from all time series in the hierarchy and incorporates the reconciliation step into a single trainable model. This is achieved by applying the reparameterization trick and casting reconciliation as an optimization problem with a closed-form solution. These model features make end-to-end learning of hierarchical forecasts possible, while accomplishing the challenging task of generating forecasts that are both probabilistic and coherent. Importantly, our approach also accommodates general aggregation constraints including grouped and temporal hierarchies. An extensive empirical evaluation on real-world hierarchical datasets demonstrates the advantages of the proposed approach over the state-of-the-art.
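The abstract describes two key ingredients: sampling base forecasts via the reparameterization trick and reconciling them through an optimization problem with a closed-form solution. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes the aggregation constraints are encoded in a matrix A (so coherent vectors satisfy A y = 0), uses the standard orthogonal projection M = I - Aᵀ(AAᵀ)⁻¹A as the closed-form reconciliation step, and uses a hypothetical toy hierarchy (total = child_1 + child_2) with Gaussian base forecasts.

```python
import numpy as np

def coherent_projection(A):
    """Closed-form orthogonal projection M = I - A^T (A A^T)^{-1} A
    onto the coherent subspace {y : A y = 0}. (Illustrative helper,
    not from the paper's codebase.)"""
    n = A.shape[1]
    return np.eye(n) - A.T @ np.linalg.solve(A @ A.T, A)

# Toy hierarchy (assumption): y = [total, child_1, child_2],
# with the single constraint total - child_1 - child_2 = 0.
A = np.array([[1.0, -1.0, -1.0]])
M = coherent_projection(A)

# Base forecast samples drawn with the reparameterization trick:
# y = mu + sigma * eps, eps ~ N(0, I), so samples stay differentiable
# in the distribution parameters (mu, sigma).
rng = np.random.default_rng(0)
mu = np.array([10.0, 6.0, 5.0])      # incoherent mean: 6 + 5 != 10
sigma = np.array([1.0, 0.5, 0.5])
eps = rng.standard_normal((1000, 3))
samples = mu + sigma * eps

# Reconciliation: project every sample onto the coherent subspace.
coherent_samples = samples @ M.T
print(np.abs(A @ coherent_samples.T).max())  # ~1e-15: coherent up to round-off
```

Because the projection is a fixed linear map applied to reparameterized samples, gradients flow through the reconciliation step back into the forecast parameters, which is what makes end-to-end training of coherent probabilistic forecasts possible in this kind of setup.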
