Title
Transformer-enhanced Hawkes process with decoupling training for information cascade prediction
Abstract
The ability to model the information diffusion process and predict its size is crucial to understanding information propagation mechanisms and is useful for many applications such as popularity prediction and fake news detection. Recent works have addressed information cascade prediction using two basic paradigms: (1) sequential methods, e.g., recurrent neural networks (RNNs), and (2) graph learning techniques that retain topological information and capture structural relationships among diffusion participants. However, existing models consider the topological and temporal features separately and fall short of simulating their entanglement in the diffusion process. Since the evolving directed acyclic graph (DAG) of information diffusion intrinsically couples topological and temporal dependencies, this separation loses cross-domain information. In this paper, we propose a transformer-enhanced Hawkes process (Hawkesformer), which links a hierarchical attention mechanism to the Hawkes self-exciting point process for information cascade prediction. Specifically, we extend the traditional Hawkes process with a topological horizon and efficiently acquire knowledge from the continuous-time domain. A two-level attention architecture parameterizes the intensity function of Hawkesformer. At the first level, we disentangle the primary and non-primary paths to simulate the coupled topological and temporal information, capturing the global dependencies between the nodes in a graph. At the second level, a local pooling attentive module embeds the cascade evolution rate to model short-term outbreaks. Extensive experiments on two real-world datasets demonstrate significant performance improvements of Hawkesformer over existing state-of-the-art models.
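For background on the self-exciting point process the abstract builds on: in a classic univariate Hawkes process, each past event raises the conditional intensity (the instantaneous event rate), with an exponentially decaying influence. This minimal sketch is illustrative only; the parameter values (`mu`, `alpha`, `beta`) and the toy cascade are assumptions, not values from the paper, and Hawkesformer replaces this hand-set kernel with an attention-parameterized intensity.

```python
import math

def hawkes_intensity(t, event_times, mu=0.2, alpha=0.8, beta=1.0):
    """Conditional intensity of a classic univariate Hawkes process:
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
    mu is the base rate; each past event adds an exponentially
    decaying excitation term, so bursts of events beget more events."""
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in event_times if ti < t)

# Toy cascade of retweet times (hours): an early burst, then one late event.
cascade = [0.0, 0.1, 0.3, 2.0]
print(round(hawkes_intensity(2.5, cascade), 3))  # -> 0.912
```

The recent event at t = 2.0 dominates the intensity at t = 2.5, illustrating why Hawkes-style models suit short-term outbreak prediction in cascades.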
Year: 2022
DOI: 10.1016/j.knosys.2022.109740
Venue: Knowledge-Based Systems
Keywords: Information diffusion, Information cascade prediction, Hawkes point process, Attention mechanism, Popularity prediction
DocType: Journal
Volume: 255
ISSN: 0950-7051
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name              Order  Citations  PageRank
Yu Liu            1      190        19.09
Xu Xovee          2      10         5.61
Goce Trajcevski   3      1732       141.26
Fan Zhou          4      101        23.20