Oral
Imputing Missing Events in Continuous-Time Event Streams
Hongyuan Mei · Guanghui Qin · Jason Eisner

Tue Jun 11th 03:05 -- 03:10 PM @ Room 201

Events that we observe in the world may be caused by other, unobserved events. We consider sequences of discrete events in continuous time. Given a probability model of complete sequences, we propose particle smoothing, a form of sequential importance sampling, to impute the missing events in an incomplete sequence. We develop a trainable family of proposal distributions based on a type of continuous-time bidirectional LSTM. Thus, unlike in particle filtering, our proposed events are conditioned on the future and not just on the past. Our method can sample an ensemble of possible complete sequences (particles), from which we form a single consensus prediction that has low Bayes risk under our chosen loss metric. We experiment in multiple synthetic and real domains, using different missingness mechanisms, and modeling the complete sequences in each domain with a neural Hawkes process (Mei & Eisner, 2017). On held-out incomplete sequences, our method is effective at inferring the ground-truth unobserved events. In particular, particle smoothing consistently improves upon particle filtering, showing the benefit of training a bidirectional proposal distribution. We additionally use multinomial resampling to mitigate the particle skewness problem, which improves results further.
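To make the importance-sampling-plus-resampling machinery concrete, the minimal sketch below imputes missing event times between observed ones. It is only an illustration: the toy homogeneous Poisson model and proposal (model_loglik, propose_missing, and the chosen rates) are hypothetical stand-ins for the paper's neural Hawkes process and continuous-time bidirectional LSTM proposal, and the single end-of-sequence resampling step collapses the fully sequential scheme into one weight-and-resample pass.

import math
import random

def model_loglik(events, rate=0.5, horizon=10.0):
    # Toy stand-in for the complete-data model (the paper uses a neural
    # Hawkes process): a homogeneous Poisson process on [0, horizon].
    return len(events) * math.log(rate) - rate * horizon

def propose_missing(observed, q_rate=1.0, horizon=10.0):
    # Toy stand-in for the trained proposal distribution (the paper uses a
    # continuous-time bidirectional LSTM conditioned on past and future
    # observations). Draw candidate missing events from a homogeneous
    # Poisson process and return them with their proposal log-density.
    t, missing, logq = 0.0, [], 0.0
    while True:
        gap = random.expovariate(q_rate)
        if t + gap > horizon:
            logq += -q_rate * (horizon - t)  # no further event before horizon
            break
        t += gap
        missing.append(t)
        logq += math.log(q_rate) - q_rate * gap
    return missing, logq

def particle_smooth(observed, n_particles=100, horizon=10.0):
    # Importance sampling with multinomial resampling. Each particle is a
    # candidate set of imputed events; its (unnormalized) weight is
    # p(observed + imputed) / q(imputed | observed).
    particles, logw = [], []
    for _ in range(n_particles):
        missing, logq = propose_missing(observed, horizon=horizon)
        complete = sorted(observed + missing)
        particles.append(missing)
        logw.append(model_loglik(complete, horizon=horizon) - logq)
    # Normalize weights via log-sum-exp for numerical stability.
    m = max(logw)
    w = [math.exp(lw - m) for lw in logw]
    z = sum(w)
    w = [wi / z for wi in w]
    # Multinomial resampling to mitigate weight skewness: draw particles in
    # proportion to their weights, yielding an equally weighted ensemble.
    return random.choices(particles, weights=w, k=n_particles)

if __name__ == "__main__":
    observed = [1.2, 3.4, 7.8]            # observed event times
    ensemble = particle_smooth(observed)  # ensemble of imputed-event sets
    print(ensemble[0])

In the paper's setting, a consensus prediction with low Bayes risk would then be formed from this ensemble, and the proposal would be trained so that its samples already anticipate the future observations.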

Author Information

Hongyuan Mei (Johns Hopkins University)
Guanghui Qin (Peking University)
Jason Eisner (Johns Hopkins University + Microsoft Semantic Machines)

[http://cs.jhu.edu/~jason/bio.html]
