Poster
Transformer Hawkes Process
Simiao Zuo · Haoming Jiang · Zichong Li · Tuo Zhao · Hongyuan Zha

Thu Jul 16 05:00 PM -- 05:45 PM & Fri Jul 17 04:00 AM -- 04:45 AM (PDT)

Modern data acquisition routinely produces massive amounts of event sequence data in various domains, such as social media, healthcare, and financial markets. These data often exhibit complicated short-term and long-term temporal dependencies. However, most existing recurrent neural network based point process models fail to capture such dependencies and yield unreliable prediction performance. To address this issue, we propose a Transformer Hawkes Process (THP) model, which leverages the self-attention mechanism to capture long-term dependencies while remaining computationally efficient. Numerical experiments on various datasets show that THP outperforms existing models in terms of both likelihood and event prediction accuracy by a notable margin. Moreover, THP is quite general and can incorporate additional structural knowledge. We provide a concrete example, where THP achieves improved prediction performance for learning multiple point processes when incorporating their relational information.
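To make the idea concrete, below is a minimal sketch (not the authors' released code) of a self-attention encoder over event sequences that parameterizes a conditional intensity, assuming PyTorch; the hyperparameters (`d_model`, `n_heads`, `num_event_types`) and the `TransformerHawkesSketch` class are illustrative assumptions, and the full model additionally evaluates the intensity continuously between events for likelihood computation.

```python
# Minimal sketch of a Transformer-based point process encoder (assumed design,
# not the official THP implementation).
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransformerHawkesSketch(nn.Module):
    def __init__(self, num_event_types, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.d_model = d_model
        self.event_emb = nn.Embedding(num_event_types, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Per-type linear head feeding a softplus intensity (assumed form).
        self.intensity_head = nn.Linear(d_model, num_event_types)

    def temporal_encoding(self, times):
        # Sinusoidal encoding of continuous event timestamps.
        pos = times.unsqueeze(-1)                                  # (B, L, 1)
        div = torch.exp(
            torch.arange(0, self.d_model, 2, dtype=times.dtype,
                         device=times.device)
            * (-math.log(10000.0) / self.d_model))                 # (d/2,)
        enc = torch.zeros(*times.shape, self.d_model,
                          dtype=times.dtype, device=times.device)
        enc[..., 0::2] = torch.sin(pos * div)
        enc[..., 1::2] = torch.cos(pos * div)
        return enc

    def forward(self, event_types, event_times):
        # event_types: (B, L) long tensor, event_times: (B, L) float tensor.
        x = self.event_emb(event_types) + self.temporal_encoding(event_times)
        L = event_types.size(1)
        # Causal mask so each event attends only to its history.
        causal = torch.triu(
            torch.ones(L, L, dtype=torch.bool, device=x.device), diagonal=1)
        h = self.encoder(x, mask=causal)                           # (B, L, d)
        # Intensity of each event type given the history up to each event.
        lam = F.softplus(self.intensity_head(h))                   # (B, L, K)
        return h, lam
```

In this sketch the causal attention mask plays the role that sequential recurrence plays in RNN-based point processes: every event can attend directly to all past events, which is how long-term dependencies are captured without step-by-step propagation.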

Author Information

Simiao Zuo (Georgia Institute of Technology)
Haoming Jiang (Georgia Tech)
Zichong Li (University of Science and Technology of China)
Tuo Zhao (Georgia Tech)
Hongyuan Zha (Georgia Institute of Technology)
