Title
Pre-Training Graph Neural Networks for Cold-Start Users and Items Representation
Abstract
The cold-start problem is a fundamental challenge for recommendation tasks. Although recent advances in Graph Neural Networks (GNNs) incorporate high-order collaborative signals to alleviate the problem, the embeddings of cold-start users and items are not explicitly optimized, and cold-start neighbors are not handled during graph convolution. This paper proposes to pre-train a GNN model before applying it to recommendation. Unlike the recommendation objective, the pre-training GNN simulates cold-start scenarios from users/items with sufficient interactions and takes embedding reconstruction as the pretext task, so that it directly improves embedding quality and can be easily adapted to new cold-start users/items. To further reduce the impact of cold-start neighbors, we incorporate a self-attention-based meta aggregator to enhance the aggregation ability of each graph convolution step, and an adaptive neighbor sampler to select effective neighbors according to feedback from the pre-training GNN model. Experiments on three public recommendation datasets show the superiority of our pre-training GNN model over the original GNN models on user/item embedding inference and the recommendation task.
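To make the pretext task described above concrete, the following is a minimal, hypothetical sketch (not the authors' code): a warm user/item is given only a few sampled neighbors to simulate cold start, those neighbors are aggregated by a simple mean-pooling graph-convolution step, and the result is pulled toward a target embedding learned from the full interaction history. All class, function, and parameter names are illustrative assumptions.

```python
# Hypothetical sketch of the embedding-reconstruction pretext task:
# simulate cold start with K sampled neighbors and reconstruct the
# "ground-truth" embedding learned from abundant interactions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MeanAggregator(nn.Module):
    """One graph-convolution step: mean-pool neighbor embeddings, then project."""

    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, self_emb, neigh_emb):
        # self_emb: (B, d); neigh_emb: (B, K, d)
        pooled = neigh_emb.mean(dim=1)  # (B, d)
        return torch.tanh(self.proj(torch.cat([self_emb, pooled], dim=-1)))


def reconstruction_loss(agg, self_emb, neigh_emb, target_emb):
    """Cosine-similarity loss between the reconstructed and target embeddings."""
    recon = agg(self_emb, neigh_emb)  # (B, d)
    return (1.0 - F.cosine_similarity(recon, target_emb, dim=-1)).mean()


if __name__ == "__main__":
    B, K, d = 32, 3, 64  # batch size, sampled neighbors per node, embedding dim
    agg = MeanAggregator(d)
    opt = torch.optim.Adam(agg.parameters(), lr=1e-3)

    # Toy tensors standing in for embeddings of warm users/items:
    self_emb = torch.randn(B, d)      # the node's own embedding
    neigh_emb = torch.randn(B, K, d)  # K sampled first-order neighbors
    target_emb = torch.randn(B, d)    # target embedding from the full interaction data

    loss = reconstruction_loss(agg, self_emb, neigh_emb, target_emb)
    loss.backward()
    opt.step()
    print(f"pretext reconstruction loss: {loss.item():.4f}")
```

The paper further replaces plain mean pooling with a self-attention-based meta aggregator and selects neighbors adaptively; the sketch only illustrates the reconstruction objective itself.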
Year
2021
DOI
10.1145/3437963.3441738
Venue
WSDM
DocType
Conference
Citations
2
PageRank
0.36
References
0
Authors
5
Name          Order  Citations  PageRank
Bowen Hao     1      4          1.06
Jing Zhang    2      1281       55.47
Hongzhi Yin   3      1364       75.83
Cuiping Li    4      39         9.19
Hong Chen     5      25         9.84