Title
Linking Transformer to Hawkes Process for Information Cascade Prediction (Student Abstract)
Abstract
An information cascade is typically formalized as a (simplified) discrete sequence of events, and recent approaches have tackled its prediction via variants of recurrent neural networks. However, the information diffusion process is essentially an evolving directed acyclic graph (DAG) in the continuous-time domain. In this paper, we propose a transformer-enhanced Hawkes process (Hawkesformer), which links a hierarchical attention mechanism with the Hawkes process to continuously model the arrival stream of discrete events. A two-level attention architecture parameterizes the intensity function of Hawkesformer, capturing the long-term dependencies between nodes in the graph and better embedding the cascade evolution rate for modeling short-term outbreaks. Experimental results demonstrate significant improvements of Hawkesformer over the state-of-the-art.
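The abstract's core idea, parameterizing a Hawkes-process conditional intensity with transformer attention over the event history, can be illustrated with a minimal sketch. The sketch below is an assumption-laden illustration in the spirit of attention-based Hawkes models, not the paper's implementation: the class name AttentiveIntensity, the layer sizes (d_model, n_heads), the softplus link, and the linear time-decay term are all hypothetical choices.

import torch
import torch.nn as nn

class AttentiveIntensity(nn.Module):
    # Minimal sketch of a conditional intensity lambda(t) parameterized by
    # self-attention over past events. Dimensions, the softplus link, and
    # the linear time-decay term are illustrative assumptions, not the
    # paper's exact two-level architecture.
    def __init__(self, d_model=32, n_heads=4):
        super().__init__()
        self.embed = nn.Linear(1, d_model)              # embed inter-event gaps
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.to_base = nn.Linear(d_model, 1)            # history -> base rate
        self.decay = nn.Parameter(torch.tensor(-0.1))   # drift since last event
        self.softplus = nn.Softplus()                   # keeps lambda(t) positive

    def encode(self, times):
        # times: (batch, seq) ascending event timestamps
        gaps = torch.diff(times, prepend=times[:, :1])  # inter-event gaps
        h = self.embed(gaps.unsqueeze(-1))              # (batch, seq, d_model)
        causal = torch.triu(torch.ones(times.size(1), times.size(1),
                                       dtype=torch.bool), diagonal=1)
        h, _ = self.attn(h, h, h, attn_mask=causal)     # attend to history only
        return h

    def intensity(self, h_last, t, t_last):
        # lambda(t) for t > t_last: an attention summary of the history plus
        # a learned linear drift in elapsed time, mapped through softplus.
        return self.softplus(self.to_base(h_last).squeeze(-1)
                             + self.decay * (t - t_last))

model = AttentiveIntensity()
times = torch.tensor([[0.0, 0.4, 1.1, 1.7]])            # one toy cascade
h = model.encode(times)                                 # (1, 4, 32)
lam = model.intensity(h[:, -1], t=2.0, t_last=1.7)      # rate after last event

The causal attention mask lets the intensity condition on the entire event history rather than a compressed recurrent state, which is the property the abstract contrasts with RNN-based predictors; the softplus simply guarantees the intensity stays positive.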
Year
2022
Venue
AAAI Conference on Artificial Intelligence
Keywords
Information Cascade, Hawkes Process, Attention Mechanism
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
5
Name               Order   Citations   PageRank
Liu Yu             1       1           0.72
Xu Xovee           2       10          5.61
Zhong Ting         3       46          11.07
Goce Trajcevski    4       1732        141.26
Fan Zhou           5       101         23.20