Poster
Deep Factors for Forecasting
Yuyang Wang · Alex Smola · Danielle Robinson · Jan Gasthaus · Dean Foster · Tim Januschowski

Tue Jun 11th 06:30 -- 09:00 PM @ Pacific Ballroom #254

Producing probabilistic forecasts for large collections of similar and/or dependent time series is a task of high practical relevance, yet a challenging one. Classical time series models fail to capture complex patterns in the data, and multivariate techniques struggle to scale to large problem sizes; however, their reliance on strong structural assumptions makes them data-efficient and allows them to provide estimates of uncertainty. The converse is true for models based on deep neural networks, which can learn complex patterns and dependencies given enough data. In this paper, we propose a hybrid model that incorporates the benefits of both approaches. Our new method is data-driven and scalable via a latent, global, deep component, and it handles uncertainty through a local classical model. We provide both theoretical and empirical evidence for the soundness of our approach: a necessary and sufficient decomposition of exchangeable time series into a global and a local part, and extensive experiments. Our experiments demonstrate the advantages of our model in terms of both data efficiency and computational complexity.
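The global/local decomposition described in the abstract can be illustrated with a minimal sketch: each series is modeled as a weighted combination of a few latent global factors plus a local stochastic term. This is only an illustrative toy, not the authors' implementation; the dimensions, the use of sinusoidal basis functions in place of a learned deep (e.g. RNN) global component, and the white-noise local model are all assumptions made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen for illustration only.
T, N, K = 100, 5, 3  # time steps, number of series, number of global factors

# Global part: K latent factors shared by all series. Sketched here as
# fixed smooth basis functions standing in for a learned deep component.
t = np.arange(T)
global_factors = np.stack(
    [np.sin(2 * np.pi * (k + 1) * t / T) for k in range(K)]
)  # shape (K, T)

# Each series mixes the global factors with its own weights ...
weights = rng.normal(size=(N, K))         # shape (N, K)
fixed_effect = weights @ global_factors   # shape (N, T)

# ... and adds a local stochastic part. White Gaussian noise is used here
# as the simplest stand-in for a local classical probabilistic model.
local_noise_scale = 0.1
random_effect = local_noise_scale * rng.normal(size=(N, T))

# Observation = global (fixed) effect + local (random) effect.
y = fixed_effect + random_effect
print(y.shape)  # (5, 100)
```

The point of the decomposition is that the global factors amortize learning across all N series (data efficiency), while uncertainty is carried by the cheap per-series local term rather than a full multivariate model (scalability).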

Author Information

Bernie Wang (AWS AI Labs)
Alex Smola (Amazon)
Danielle Robinson (Amazon Web Services)
Jan Gasthaus (Amazon Research)
Dean Foster (Amazon)
Tim Januschowski (Amazon Research)