Analyzing multivariate time series data is important for predicting future events and changes in complex systems across finance, manufacturing, and administrative decision making. The expressive power of Gaussian Process (GP) regression methods has been significantly improved by compositional covariance structures. In this paper, we present a new GP model that naturally handles multiple time series by placing an Indian Buffet Process (IBP) prior on the presence of shared kernels. Our selective covariance structure decomposition allows parameters to be shared across a selected subset of the time series. We also investigate the well-definedness of the model when infinitely many latent components are introduced. We present a pragmatic search algorithm that efficiently explores a larger space of structures. Experiments on five real-world data sets demonstrate that our model outperforms existing methods in terms of structure discovery and predictive performance.
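To make the selective covariance construction concrete, here is a minimal, hypothetical sketch (not the authors' implementation): a binary matrix Z, conceptually playing the role of an IBP draw, selects which shared base kernels enter each series' covariance, so the hyperparameters of a selected kernel are shared by every series that activates it. All names, kernels, and hyperparameter values below are illustrative assumptions.

import numpy as np

def se_kernel(x1, x2, lengthscale, variance):
    # Squared-exponential base kernel.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def periodic_kernel(x1, x2, period, lengthscale, variance):
    # Periodic base kernel.
    d = np.abs(x1[:, None] - x2[None, :])
    return variance * np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / lengthscale ** 2)

# Hypothetical library of shared base kernels (choices and values are illustrative).
kernel_library = [
    lambda a, b: se_kernel(a, b, lengthscale=1.0, variance=1.0),
    lambda a, b: periodic_kernel(a, b, period=0.5, lengthscale=0.7, variance=0.5),
    lambda a, b: se_kernel(a, b, lengthscale=5.0, variance=0.2),
]

def series_covariance(x, z_row, noise=1e-2):
    # Covariance of one series: the sum of the base kernels its row of Z selects.
    K = sum(z * k(x, x) for z, k in zip(z_row, kernel_library))
    return K + noise * np.eye(len(x))

def log_marginal_likelihood(x, y, z_row):
    # Standard GP log marginal likelihood under the selected covariance.
    K = series_covariance(x, z_row)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2 * np.pi))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(0, 5, 60)
    # Toy binary assignment matrix Z for D=2 series and K=3 shared kernels.
    Z = np.array([[1, 1, 0],
                  [1, 0, 1]])
    Y = [rng.normal(size=x.size), rng.normal(size=x.size)]
    for d, y in enumerate(Y):
        print(f"series {d}: log marginal likelihood = "
              f"{log_marginal_likelihood(x, y, Z[d]):.2f}")

In the paper, Z itself is inferred under the IBP prior together with the kernel structures and hyperparameters; the fixed Z above is only meant to show how a shared kernel library induces per-series covariances.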
Author Information
Anh Tong (Ulsan National Institute of Science and Technology)
Jaesik Choi (Ulsan National Institute of Science and Technology)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Oral: Discovering Latent Covariance Structures for Multiple Time Series
  Wed Jun 12th 10:00 -- 10:05 PM, Room 101
More from the Same Authors
- 2018 Poster: Deep Reinforcement Learning in Continuous Action Spaces: a Case Study in the Game of Simulated Curling
  Kyowoon Lee · Sol-A Kim · Jaesik Choi · Seong-Whan Lee
- 2018 Oral: Deep Reinforcement Learning in Continuous Action Spaces: a Case Study in the Game of Simulated Curling
  Kyowoon Lee · Sol-A Kim · Jaesik Choi · Seong-Whan Lee