Self-Supervised Dynamical System Representations for Physiological Time-Series
Yenho Chen ⋅ Maxwell Xu ⋅ James Rehg ⋅ Christopher Rozell
Abstract
Self-supervised learning for physiological time-series aims to capture the identity of the underlying dynamical process while filtering irrelevant noise. However, existing approaches may obscure the clinical semantics important for downstream transferability. Weakly constrained pretext tasks (e.g., contrastive learning, masked autoencoders) may incorrectly ignore the underlying dynamical structure, while structurally constrained models (e.g., structured variational autoencoders) are unable to selectively filter sample-specific noise. To bridge this gap, we propose ${\bf PULSE}$, a novel pretraining objective that simultaneously preserves the dynamical relationships important to physiological time-series and selectively removes irrelevant noise. We achieve this by formulating a dynamical systems model that identifies transferable and non-transferable information between time-series windows, and targeting the former through a novel cross-reconstruction objective. We establish theory that provides conditions under which transferable information is recovered, and empirically validate it through synthetic experiments. On several real-world datasets, PULSE effectively distinguishes clinical semantic classes, increases label efficiency, and improves transfer learning performance.