

Poster

Context Consistency Regularization for Label Sparsity in Time Series

Yooju Shin · Susik Yoon · Hwanjun Song · Dongmin Park · Byunghyun Kim · Jae-Gil Lee · Byung Suk Lee

Exhibit Hall 1 #503

Abstract:

Labels are typically sparse in real-world time series because of the high annotation cost. Recently, consistency regularization techniques have been used to generate artificial labels from unlabeled augmented instances. To fully exploit the sequential nature of time series in consistency regularization, we propose a novel data augmentation method called context-attached augmentation, which attaches preceding and succeeding instances to a target instance to form its augmented instance. Unlike existing augmentation techniques, which modify a target instance by directly perturbing its attributes, context-attached augmentation generates instances with varying attached contexts while keeping the target instance intact. Based on this augmentation method, we propose a context consistency regularization framework, which first attaches different contexts to a target instance sampled from a given time series and then shares unitary reliability-based cross-window labels across the augmented instances to maintain consistency. Comprehensive experiments on real-world time-series datasets demonstrate that the proposed framework outperforms existing state-of-the-art consistency regularization frameworks.
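The core idea of context-attached augmentation can be sketched in a few lines: the target window is left untouched, and augmented views are produced only by attaching surrounding context of different lengths. The following is a minimal illustrative sketch, not the authors' implementation; the function name, window length, and context sizes are assumptions made for the example.

```python
import numpy as np

def context_attached_augment(series, target_start, target_len,
                             context_lens=(8, 16, 32)):
    """Illustrative sketch: build augmented views of a target window
    by attaching preceding and succeeding context of varying length.
    The target window itself is never perturbed."""
    target_end = target_start + target_len
    views = []
    for c in context_lens:
        left = max(0, target_start - c)               # preceding context
        right = min(len(series), target_end + c)      # succeeding context
        views.append(series[left:right])
    return views

# Usage: three augmented views of the same 24-step target window,
# differing only in how much surrounding context is attached.
series = np.sin(np.linspace(0, 20, 500)) + 0.1 * np.random.randn(500)
views = context_attached_augment(series, target_start=200, target_len=24)
print([v.shape for v in views])  # [(40,), (56,), (88,)]
```

In the full framework, a classifier's predictions on such context-attached views of the same target instance would be encouraged to agree, with the shared label weighted by a reliability measure across windows.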
