Learning Long Range Spatio-Temporal Representations over Continuous Time Dynamic Graphs with State Space Models
Ayushman Raghuvanshi ⋅ Thummaluru Reddy ⋅ Sundeep Prabhakar Chepuri ⋅ Mahesh Chandran
Abstract
Continuous-time dynamic graphs (CTDGs) provide a rich framework for capturing fine-grained temporal patterns in evolving relational data. Long-range information propagation is a key challenge in representation learning, as information must be retained and updated over long temporal horizons. Existing approaches restrict models to one-hop or local temporal neighborhoods and fail to capture multi-hop or global structural patterns. To mitigate this, we derive a parameter-efficient state-space modeling framework for continuous-time dynamic graphs ($\texttt{CTDG-SSM}$) from first principles. We first introduce the continuous-time topology-aware higher-order polynomial projection operator ($\texttt{CTT-HiPPO}$), a novel memory-based reformulation of $\texttt{HiPPO}$ that jointly encodes temporal dynamics and graph structure. The solutions of $\texttt{CTT-HiPPO}$ are obtained by projecting the classical $\texttt{HiPPO}$ solution through a polynomial of the graph Laplacian, yielding topology-aware memory updates that admit an equivalent state-space formulation for CTDGs ($\texttt{CTDG-SSM}$). A computationally efficient discrete formulation is then obtained via the zero-order hold approach for model implementation. Across benchmarks on dynamic link prediction, dynamic node classification, and sequence classification, $\texttt{CTDG-SSM}$ achieves state-of-the-art performance, with notably large gains on datasets that require long-range temporal (LRT) and spatial reasoning.
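The two building blocks named in the abstract — a HiPPO-style state matrix filtered through a polynomial of the graph Laplacian, and zero-order-hold (ZOH) discretization — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the HiPPO-LegS matrix and the ZOH formulas are standard, while `laplacian_poly` and the example polynomial coefficients are hypothetical stand-ins for the paper's topology-aware projection.

```python
import numpy as np
from scipy.linalg import expm

def hippo_legs(N):
    # Classical HiPPO-LegS transition matrix A (N x N),
    # the continuous-time memory operator that CTT-HiPPO builds on.
    A = np.zeros((N, N))
    for n in range(N):
        for k in range(N):
            if n > k:
                A[n, k] = -np.sqrt((2 * n + 1) * (2 * k + 1))
            elif n == k:
                A[n, k] = -(n + 1)
    return A

def laplacian_poly(L, coeffs):
    # Hypothetical topology filter: p(L) = sum_k coeffs[k] * L^k,
    # a polynomial of the graph Laplacian used to inject structure.
    P = np.zeros_like(L)
    Lk = np.eye(L.shape[0])
    for c in coeffs:
        P += c * Lk
        Lk = Lk @ L
    return P

def zoh_discretize(A, B, dt):
    # Zero-order hold: A_d = exp(A*dt), B_d = A^{-1} (A_d - I) B.
    Ad = expm(A * dt)
    Bd = np.linalg.solve(A, (Ad - np.eye(A.shape[0])) @ B)
    return Ad, Bd

# Example: a 2-node graph, a degree-1 Laplacian polynomial, and a
# topology-aware transition built as p(L) (x) A (one common choice;
# the paper's exact coupling may differ).
L = np.array([[1.0, -1.0], [-1.0, 1.0]])   # Laplacian of a single edge
A = hippo_legs(4)
A_topo = np.kron(laplacian_poly(L, [0.5, 0.25]), np.eye(4)) @ np.kron(np.eye(2), A)
Ad, Bd = zoh_discretize(A, np.ones((4, 1)), dt=0.1)
```

The ZOH step is what makes irregular event timestamps tractable: each inter-event gap `dt` yields its own discrete transition `A_d = exp(A*dt)`, so the same continuous-time operator serves arbitrary event spacings.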