Multivariate time series data in real-world applications typically contain a significant number of missing values. The dominant approach to classification with such data is to impute the missing values heuristically with fixed values (zero, the mean, the value of an adjacent time step) or with learnable parameters. However, these simple strategies do not account for the data-generating process and, more importantly, fail to capture the uncertainty in prediction arising from the multiple possibilities for the missing values. In this paper, we propose a novel probabilistic framework for classification of multivariate time series data with missing values. Our model consists of two parts: a deep generative model for missing value imputation and a classifier. Extending existing deep generative models to better capture the structure of time series data, the generative part is trained to impute the missing values in multiple plausible ways, effectively modeling the uncertainty of the imputation. The classifier part takes the time series together with the imputed values and classifies the signal, and is trained to capture the predictive uncertainty arising from the multiplicity of plausible imputations. Importantly, we show that naïvely combining the generative model and the classifier can result in trivial solutions in which the generative model does not produce meaningful imputations. To resolve this, we present a novel regularization technique that encourages the model to produce imputations that are useful for classification. Through extensive experiments on real-world time series data with missing values, we demonstrate the effectiveness of our method.
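To make the two-part design concrete, here is a minimal PyTorch sketch, not the authors' released code: a probabilistic imputer produces multiple plausible completions of a partially observed series, a recurrent classifier averages its class probabilities over those completions, and a stand-in regularizer hides some observed entries and scores the imputer on them so it cannot collapse to trivial imputations. All names (`ProbabilisticImputer`, `ImputeAndClassify`, `training_loss`, `heldout_frac`) are hypothetical, and the regularizer shown is one simple realization of the idea, not the paper's specific technique.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProbabilisticImputer(nn.Module):
    """Reads (values, mask) per time step; outputs a per-step Gaussian over
    features, so each draw yields one plausible completion of the series."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(2 * n_features, hidden, batch_first=True,
                          bidirectional=True)
        self.mu = nn.Linear(2 * hidden, n_features)
        self.logvar = nn.Linear(2 * hidden, n_features)

    def forward(self, x, mask):
        # x: (B, T, D); mask: (B, T, D) with 1.0 where observed.
        h, _ = self.rnn(torch.cat([x * mask, mask], dim=-1))
        return self.mu(h), self.logvar(h)

    def sample(self, x, mask):
        mu, logvar = self(x, mask)
        draw = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return mask * x + (1.0 - mask) * draw  # observed entries kept as-is

class ImputeAndClassify(nn.Module):
    """Averages class probabilities over K sampled imputations, so the
    prediction reflects the uncertainty from the multiple completions."""
    def __init__(self, n_features, n_classes, K=5):
        super().__init__()
        self.imputer = ProbabilisticImputer(n_features)
        self.rnn = nn.GRU(n_features, 64, batch_first=True)
        self.head = nn.Linear(64, n_classes)
        self.K = K

    def forward(self, x, mask):
        probs = []
        for _ in range(self.K):
            xi = self.imputer.sample(x, mask)
            _, h = self.rnn(xi)
            probs.append(F.softmax(self.head(h[-1]), dim=-1))
        return torch.stack(probs).mean(dim=0)

def training_loss(model, x, mask, y, heldout_frac=0.2):
    """Cross-entropy plus a stand-in regularizer: hide a random subset of
    the *observed* entries and score the imputer's Gaussian on them, so the
    imputer must track the data distribution rather than collapsing to
    whatever the classifier finds easiest."""
    drop = (torch.rand_like(mask) < heldout_frac) & (mask > 0)
    mask_in = mask * (~drop).float()
    probs = model(x, mask_in)
    ce = F.nll_loss(torch.log(probs + 1e-8), y)  # averaged probs -> NLL
    mu, logvar = model.imputer(x, mask_in)
    nll = 0.5 * (logvar + (x - mu) ** 2 / torch.exp(logvar))
    rec = (nll * drop.float()).sum() / drop.float().sum().clamp(min=1.0)
    return ce + rec
```

Averaging probabilities (rather than logits) over the K draws corresponds to marginalizing the class posterior over the missing values, which is what lets the combined model expose imputation-induced predictive uncertainty instead of hiding it inside a single point estimate.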
Author Information
SeungHyun Kim (KAIST)
Hyunsu Kim (KAIST)
EungGu Yun (Saige Research)
Hwangrae Lee (Samsung Electronics)
Jaehun Lee
Juho Lee (KAIST, AITRICS)
More from the Same Authors
- 2023: Function Space Bayesian Pseudocoreset for Bayesian Neural Networks »
  Balhae Kim · Hyungi Lee · Juho Lee
- 2023: Early Exiting for Accelerated Inference in Diffusion Models »
  Taehong Moon · Moonseok Choi · EungGu Yun · Jongmin Yoon · Gayoung Lee · Juho Lee
- 2023: Towards Safe Self-Distillation of Internet-Scale Text-to-Image Diffusion Models »
  Sanghyun Kim · Seohyeon Jung · Balhae Kim · Moonseok Choi · Jinwoo Shin · Juho Lee
- 2023 Poster: Regularizing Towards Soft Equivariance Under Mixed Symmetries »
  Hyunsu Kim · Hyungi Lee · Hongseok Yang · Juho Lee
- 2023 Poster: Scalable Set Encoding with Universal Mini-Batch Consistency and Unbiased Full Set Gradient Approximation »
  Jeffrey Willette · Seanie Lee · Bruno Andreis · Kenji Kawaguchi · Juho Lee · Sung Ju Hwang
- 2023 Poster: Traversing Between Modes in Function Space for Fast Ensembling »
  EungGu Yun · Hyungi Lee · Giung Nam · Juho Lee
- 2022 Poster: Improving Ensemble Distillation With Weight Averaging and Diversifying Perturbation »
  Giung Nam · Hyungi Lee · Byeongho Heo · Juho Lee
- 2022 Poster: Set Based Stochastic Subsampling »
  Bruno Andreis · Seanie Lee · A. Tuan Nguyen · Juho Lee · Eunho Yang · Sung Ju Hwang
- 2022 Spotlight: Set Based Stochastic Subsampling »
  Bruno Andreis · Seanie Lee · A. Tuan Nguyen · Juho Lee · Eunho Yang · Sung Ju Hwang
- 2022 Spotlight: Improving Ensemble Distillation With Weight Averaging and Diversifying Perturbation »
  Giung Nam · Hyungi Lee · Byeongho Heo · Juho Lee
- 2021 Poster: Adversarial Purification with Score-based Generative Models »
  Jongmin Yoon · Sung Ju Hwang · Juho Lee
- 2021 Spotlight: Adversarial Purification with Score-based Generative Models »
  Jongmin Yoon · Sung Ju Hwang · Juho Lee