Title
IncreGNN: Incremental Graph Neural Network Learning by Considering Node and Parameter Importance
Abstract
Graph Neural Networks (GNNs) have shown powerful learning and reasoning ability. However, real-world graphs are generally dynamic, i.e., their topological structure evolves constantly over time. On the one hand, the learning ability of existing GNNs declines because they cannot process graph streaming data. On the other hand, the cost of retraining GNNs from scratch becomes prohibitively high as the scale of graph streaming data grows. Therefore, this paper proposes IncreGNN, an online incremental learning framework based on GNNs, which avoids the high computational cost of retraining from scratch and prevents catastrophic forgetting during incremental training. Specifically, we propose a sampling strategy based on node importance that reduces the amount of training data while preserving historical knowledge, and a regularization strategy that avoids the over-fitting caused by insufficient sampling. Experimental evaluations show the superiority of IncreGNN over existing GNNs on the link prediction task.
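For intuition only, below is a minimal, hypothetical PyTorch sketch of the two ideas the abstract names: replaying a small set of important historical nodes and regularizing important parameters during incremental training on a new snapshot. All names here (SimpleGCN, sample_important_nodes, ewc_penalty, incremental_step) are illustrative assumptions, not the authors' implementation; the parameter-importance term is written in the style of an EWC-like quadratic penalty.

# Hypothetical sketch: importance-based replay sampling + parameter-importance
# regularization for incremental GNN training on graph snapshots.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGCN(nn.Module):
    """One-layer GCN (H = A_norm @ X @ W), included only to make the sketch runnable."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, adj_norm, x):
        return adj_norm @ self.lin(x)


def sample_important_nodes(importance, k):
    """Keep the top-k historical nodes by an importance score (e.g., degree or PageRank)."""
    return torch.topk(importance, k).indices


def ewc_penalty(model, old_params, param_importance):
    """Penalize drift on parameters deemed important for previously learned snapshots."""
    loss = 0.0
    for name, p in model.named_parameters():
        loss = loss + (param_importance[name] * (p - old_params[name]) ** 2).sum()
    return loss


def incremental_step(model, optimizer, adj_norm, x, edge_pairs, labels,
                     old_params, param_importance, lam=100.0):
    """One update on a new snapshot: link-prediction loss plus the regularization term."""
    model.train()
    optimizer.zero_grad()
    h = model(adj_norm, x)
    # Dot-product scores for candidate edges (link prediction).
    scores = (h[edge_pairs[0]] * h[edge_pairs[1]]).sum(dim=-1)
    task_loss = F.binary_cross_entropy_with_logits(scores, labels)
    loss = task_loss + lam * ewc_penalty(model, old_params, param_importance)
    loss.backward()
    optimizer.step()
    return loss.item()

In this sketch, node importance is assumed to come from a structural score such as degree or PageRank, and parameter importance from a diagonal Fisher-style estimate computed on earlier snapshots; the paper's own definitions of these quantities may differ.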
Year: 2022
DOI: 10.1007/978-3-031-00123-9_59
Venue: Database Systems for Advanced Applications
Keywords: Graph neural networks, Dynamic graph, Catastrophic forgetting, Incremental learning
DocType: Conference
ISSN: 0302-9743
Citations: 0
PageRank: 0.34
References: 0
Authors: 6

Name          Order   Citations   PageRank
Wei Di        1       0           0.34
Yu Gu         2       201         34.98
Song Yumeng   3       0           0.34
Song Zhen     4       0           0.34
Li Fangfang   5       0           0.34
Ge Yu         6       1313        175.88