

Poster

Transformer Hawkes Process

Simiao Zuo · Haoming Jiang · Zichong Li · Tuo Zhao · Hongyuan Zha

Keywords: [ Computational Social Sciences ] [ Time Series and Sequence Models ] [ Applications - Other ]


Abstract:

Modern data acquisition routinely produces massive amounts of event sequence data in various domains, such as social media, healthcare, and financial markets. These data often exhibit complicated short-term and long-term temporal dependencies. However, most existing recurrent neural network based point process models fail to capture such dependencies and yield unreliable prediction performance. To address this issue, we propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies while remaining computationally efficient. Numerical experiments on various datasets show that THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin. Moreover, THP is quite general and can incorporate additional structural knowledge. We provide a concrete example in which THP achieves improved prediction performance when learning multiple point processes with their relational information incorporated.
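As a rough illustration of the idea described above, the sketch below encodes an event sequence with sinusoidal temporal embeddings plus event-type embeddings, applies causally masked self-attention, and maps the hidden states to per-type conditional intensities. The module name, the single-layer architecture, and the softplus intensity link are illustrative assumptions for this sketch, not the paper's exact parameterization.

```python
import math
import torch
import torch.nn as nn


class SimpleTHP(nn.Module):
    """Minimal sketch of a Transformer-based Hawkes process encoder.

    Assumptions (not taken from the abstract): sinusoidal temporal encoding,
    a single self-attention layer, and a softplus link for the intensity.
    """

    def __init__(self, num_event_types, d_model=64, n_heads=4):
        super().__init__()
        self.event_emb = nn.Embedding(num_event_types, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.intensity_head = nn.Linear(d_model, num_event_types)
        self.softplus = nn.Softplus()
        self.d_model = d_model

    def temporal_encoding(self, times):
        # Sinusoidal encoding of continuous event timestamps.
        device = times.device
        idx = torch.arange(0, self.d_model, 2, device=device).float()
        div = torch.exp(-math.log(10000.0) * idx / self.d_model)
        angles = times.unsqueeze(-1) * div          # (batch, seq, d_model/2)
        enc = torch.zeros(*times.shape, self.d_model, device=device)
        enc[..., 0::2] = torch.sin(angles)
        enc[..., 1::2] = torch.cos(angles)
        return enc

    def forward(self, event_types, event_times):
        # event_types: (batch, seq) long; event_times: (batch, seq) float.
        x = self.event_emb(event_types) + self.temporal_encoding(event_times)
        seq_len = event_types.size(1)
        # Causal mask so each event attends only to its own history.
        causal = torch.triu(
            torch.ones(seq_len, seq_len, dtype=torch.bool, device=x.device), 1
        )
        h, _ = self.attn(x, x, x, attn_mask=causal)
        # Per-type conditional intensities evaluated at each event time.
        return self.softplus(self.intensity_head(h))


# Usage: a batch of 2 sequences, 5 events each, over 3 event types.
model = SimpleTHP(num_event_types=3)
types = torch.randint(0, 3, (2, 5))
times = torch.cumsum(torch.rand(2, 5), dim=-1)   # increasing timestamps
intensities = model(types, times)                # shape (2, 5, 3)
```

In a full model these intensities would enter the point-process log-likelihood (log-intensities at observed events minus the integrated intensity over the observation window); that training objective is omitted here.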
