Poster
Modeling Irregular Time Series with Continuous Recurrent Units
Mona Schirmer · Mazin Eltayeb · Stefan Lessmann · Maja Rudolph
Hall E #419
Keywords: [ DL: Recurrent Networks ] [ PM: Bayesian Models and Methods ] [ T: Probabilistic Methods ] [ MISC: Supervised Learning ] [ MISC: Sequential, Network, and Time Series Modeling ] [ APP: Time Series ] [ APP: Health ] [ APP: Climate ] [ DL: Sequential Models, Time series ]
Recurrent neural networks (RNNs) are a popular choice for modeling sequential data. Modern RNN architectures assume constant time intervals between observations. However, in many datasets (e.g., medical records) observation times are irregular and can carry important information. To address this challenge, we propose continuous recurrent units (CRUs), a neural architecture that can naturally handle irregular intervals between observations. The CRU assumes a hidden state that evolves according to a linear stochastic differential equation and is integrated into an encoder-decoder framework. The recursive computations of the CRU can be derived in closed form using the continuous-discrete Kalman filter. The resulting recurrent architecture has temporal continuity between hidden states and a gating mechanism that can optimally integrate noisy observations. We derive an efficient parameterization scheme for the CRU that leads to a fast implementation, f-CRU. We empirically study the CRU on a number of challenging datasets and find that it interpolates irregular time series better than methods based on neural ordinary differential equations.
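To make the closed-form recursion concrete, below is a minimal NumPy sketch of the continuous-discrete Kalman filter that the CRU builds on: the latent Gaussian state is propagated analytically across an irregular gap of length dt under a linear SDE, then corrected against the new observation, with the Kalman gain playing the role of the gating mechanism that weighs the prior against the noisy measurement. All quantities here (A, Q, H, R, the dimensions, the toy data) are illustrative placeholders, not the paper's learned, encoder-parameterized model; this is a sketch of the underlying filter, not the CRU implementation itself.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical toy dimensions and parameters -- not the paper's learned values.
D_Z, D_Y = 4, 2                        # latent and observation dimensions
rng = np.random.default_rng(0)

A = -0.5 * np.eye(D_Z)                 # drift of the linear SDE dz = A z dt + dW
Q = 0.1 * np.eye(D_Z)                  # diffusion (process-noise) intensity
H = rng.standard_normal((D_Y, D_Z))    # observation matrix
R = 0.05 * np.eye(D_Y)                 # observation-noise covariance

def predict(mu, Sigma, dt):
    """Propagate the latent Gaussian across a gap of length dt (closed form)."""
    Phi = expm(A * dt)                                 # state transition exp(A*dt)
    # Integrated process noise via the matrix-fraction (Van Loan) trick.
    M = np.block([[A, Q], [np.zeros_like(A), -A.T]])
    E = expm(M * dt)
    Q_dt = E[:D_Z, D_Z:] @ Phi.T       # \int_0^dt exp(As) Q exp(As)^T ds
    return Phi @ mu, Phi @ Sigma @ Phi.T + Q_dt

def update(mu, Sigma, y):
    """Kalman update: the gain gates between the prior and the observation."""
    S = H @ Sigma @ H.T + R                            # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)                 # Kalman gain
    mu_post = mu + K @ (y - H @ mu)
    Sigma_post = (np.eye(D_Z) - K @ H) @ Sigma
    return mu_post, Sigma_post

# Irregularly spaced observation times and (synthetic) noisy observations.
times = np.array([0.0, 0.3, 1.1, 1.25, 2.7])
ys = rng.standard_normal((len(times), D_Y))

mu, Sigma = np.zeros(D_Z), np.eye(D_Z)
t_prev = 0.0
for t, y in zip(times, ys):
    mu, Sigma = predict(mu, Sigma, t - t_prev)  # continuous-time prediction
    mu, Sigma = update(mu, Sigma, y)            # discrete-time correction
    t_prev = t
```

Note how the gap length enters only through exp(A*dt) and the integrated noise term, so arbitrary spacings between observations are handled without discretizing time; in the CRU these filter parameters are instead produced by learned encoder networks.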