Sequential VAEs have been successfully applied to many high-dimensional time series modelling problems, with many variants relying on discrete-time mechanisms such as recurrent neural networks (RNNs). Continuous-time methods, on the other hand, have recently gained traction, especially in the context of irregularly sampled time series, where they handle the data better than discrete-time methods. One such class is Gaussian process variational autoencoders (GPVAEs), in which the VAE prior is set to a Gaussian process (GP). However, a major limitation of GPVAEs is that they inherit the cubic computational cost of GPs, making them unattractive to practitioners. In this work, we leverage the equivalent discrete state-space representation of Markovian GPs to enable linear-time GPVAE training via Kalman filtering and smoothing. For our model, the Markovian GPVAE (MGPVAE), we show on a variety of high-dimensional temporal and spatiotemporal tasks that our method performs favourably compared to existing approaches whilst being highly computationally scalable.
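The key idea in the abstract, that a Markovian GP admits a discrete state-space form on which Kalman filtering runs in linear time, can be illustrated with a minimal sketch. The example below is a hypothetical illustration (not the paper's code) using the simplest case: a Matérn-1/2 (Ornstein–Uhlenbeck) kernel, whose state-space form is a scalar AR(1)-style recursion, filtered against noisy observations in O(N) time.

```python
import numpy as np

def kalman_filter_ou(y, dt, lengthscale=1.0, variance=1.0, noise=0.1):
    """Linear-time GP inference with a Matern-1/2 (OU) kernel via its
    1-D state-space representation. Illustrative sketch only: the
    function name, arguments, and kernel choice are our assumptions,
    not the MGPVAE implementation.
    """
    m, P = 0.0, variance            # stationary prior mean and variance
    A = np.exp(-dt / lengthscale)   # transition over a step of size dt
    Q = variance * (1.0 - A ** 2)   # process noise preserving stationarity
    means, covs = [], []
    for yk in y:
        # predict: propagate state through the linear dynamics
        m, P = A * m, A * P * A + Q
        # update: scalar Kalman gain against the noisy observation yk
        S = P + noise
        K = P / S
        m = m + K * (yk - m)
        P = (1.0 - K) * P
        means.append(m)
        covs.append(P)
    return np.array(means), np.array(covs)
```

Each observation is absorbed in constant time, so the whole pass is O(N) rather than the O(N³) of a dense GP solve; a backward smoothing pass (omitted here) would yield the full posterior marginals used during training.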
Author Information
Harrison Zhu (Imperial College London)
Carles Balsells Rodas (Imperial College London)
Yingzhen Li (Imperial College London)
More from the Same Authors
- 2022 : Markovian Gaussian Process Autoencoders »
  Harrison Zhu · Carles Balsells Rodas · Yingzhen Li
- 2023 : On the Identifiability of Markov Switching Models »
  Carles Balsells Rodas · Yixin Wang · Yingzhen Li
- 2023 : Training Discrete EBMs with Energy Discrepancy »
  Tobias Schröder · Zijing Ou · Yingzhen Li · Andrew Duncan
- 2021 : Invited Talk 2 (Yingzhen Li): Inference with scores: slices, diffusions and flows »
  Yingzhen Li
- 2021 Poster: Active Slices for Sliced Stein Discrepancy »
  Wenbo Gong · Kaibo Zhang · Yingzhen Li · Jose Miguel Hernandez-Lobato
- 2021 Spotlight: Active Slices for Sliced Stein Discrepancy »
  Wenbo Gong · Kaibo Zhang · Yingzhen Li · Jose Miguel Hernandez-Lobato
- 2021 : Invited Talk #1 - Evaluating approximate inference for BNNs »
  Yingzhen Li