Title
Continual Information Cascade Learning
Abstract
Modeling the information diffusion process is an essential step towards understanding the mechanisms that drive the success of information. Existing methods either exploit various features associated with cascades to study the underlying factors governing information propagation, or leverage graph representation techniques to model the diffusion process in an end-to-end manner. However, current solutions are valid only for a static, fixed observation scenario and fail to handle growing observations, owing to the catastrophic forgetting problem inherent in the machine learning approaches used to model and predict cascades. To remedy this issue, we propose CICP (Continual Information Cascades Prediction), a novel dynamic information diffusion model. CICP employs graph neural networks to model information diffusion and continually adapts to increasing observations. It captures the correlations between successive observations while preserving the parameters that are important for cascade evolution and transition. Experiments on real-world cascade datasets demonstrate that our method not only improves prediction performance as data accumulate but also prevents the model from forgetting previously learned tasks.
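The abstract describes the approach only at a high level: a GNN-based cascade model that keeps adapting to new observations while protecting parameters considered important for earlier ones. The sketch below is a minimal illustration, not the authors' implementation; it assumes an EWC-style quadratic penalty as the parameter-preservation step and a plain message-passing regressor as the cascade model. All names (CascadeGNN, fisher_diagonal, train_stage) and hyperparameters are hypothetical.

```python
# Illustrative sketch only (not the paper's code): a GNN-style cascade popularity
# regressor trained over successive observation stages, with an EWC-like quadratic
# penalty to limit forgetting of earlier stages. Names and values are assumptions.
import torch
import torch.nn as nn

class CascadeGNN(nn.Module):
    """Two propagation steps over the observed cascade graph, then mean-pool + regress."""
    def __init__(self, in_dim=8, hid_dim=32):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, hid_dim)
        self.out = nn.Linear(hid_dim, 1)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) row-normalized adjacency with self-loops
        h = torch.relu(adj @ self.w1(x))
        h = torch.relu(adj @ self.w2(h))
        return self.out(h.mean(dim=0))      # predicted (log) future popularity, shape (1,)

def fisher_diagonal(model, data, loss_fn):
    """Diagonal Fisher estimate: how important each parameter was for the last stage."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for x, adj, y in data:
        model.zero_grad()
        loss_fn(model(x, adj), y).backward()
        for n, p in model.named_parameters():
            fisher[n] += p.grad.detach() ** 2
    return {n: f / max(len(data), 1) for n, f in fisher.items()}

def train_stage(model, data, anchor=None, fisher=None, lam=100.0, epochs=5):
    """Train on one observation stage; penalize drift from important old weights."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    mse = nn.MSELoss()
    for _ in range(epochs):
        for x, adj, y in data:               # y: 1-element target tensor
            loss = mse(model(x, adj), y)
            if anchor is not None:           # EWC-style regularization toward old weights
                for n, p in model.named_parameters():
                    loss = loss + lam * (fisher[n] * (p - anchor[n]) ** 2).sum()
            opt.zero_grad(); loss.backward(); opt.step()
    return model

# Usage: iterate over successive, growing cascade observations.
# stages = [stage1_data, stage2_data, ...]   # each item: list of (x, adj, y) tuples
# model, anchor, fisher = CascadeGNN(), None, None
# for data in stages:
#     model = train_stage(model, data, anchor, fisher)
#     anchor = {n: p.detach().clone() for n, p in model.named_parameters()}
#     fisher = fisher_diagonal(model, data, nn.MSELoss())
```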
Year
2020
DOI
10.1109/GLOBECOM42002.2020.9322124
Venue
2020 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM)
Keywords
information cascades, continual learning, catastrophic forgetting, graph neural networks, popularity prediction
DocType
Conference
ISSN
2334-0983
Citations
0
PageRank
0.34
References
0
Authors (6)
Name | Order | Citations | PageRank
Fan Zhou | 1 | 391 | 4.05
Xin Jing | 2 | 0 | 0.68
Xu Xovee | 3 | 10 | 5.61
Ting Zhong | 4 | 15 | 4.83
Goce Trajcevski | 5 | 3 | 1.04
Wu Jin | 6 | 6 | 2.46