Title
Embedding Compression with Hashing for Efficient Representation Learning in Large-Scale Graph
Abstract
Graph neural networks (GNNs) are deep learning models designed specifically for graph data, and they typically rely on node features as input to the first layer. When applying such networks to graphs without node features, one can either extract simple graph-based node features (e.g., node degree) or learn the input node representations (i.e., embeddings) while training the network. Although the latter approach, which trains the node embeddings, often leads to better performance, the number of parameters associated with the embeddings grows linearly with the number of nodes. It is therefore impractical to train the input node embeddings together with GNNs within graphics processing unit (GPU) memory in an end-to-end fashion when dealing with industrial-scale graph data. Inspired by the embedding compression methods developed for natural language processing (NLP) tasks, we develop a node embedding compression method in which each node is compactly represented by a bit vector instead of a floating-point vector. The parameters used in the compression method can be trained together with the GNN. We show that the proposed node embedding compression method achieves superior performance compared to the alternatives.
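The abstract describes the technique only at a high level. As a rough illustration of the general idea, the PyTorch sketch below shows one plausible reading of hashing-based embedding compression: each node ID is hashed to a fixed bit vector, and a small trainable projection (learned jointly with the GNN) maps that bit vector to a dense input embedding, so the parameter count no longer grows with the number of nodes. The class name, hashing scheme, and all hyperparameters are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn as nn

class HashedBitEmbedding(nn.Module):
    """Hypothetical sketch (not the paper's method): each node ID is hashed
    to a fixed bit vector, and a small trainable linear projection turns
    that bit vector into a dense embedding. Only the projection is learned,
    so the parameter count is independent of the number of nodes."""

    def __init__(self, num_bits: int = 64, dim: int = 128, seed: int = 0):
        super().__init__()
        gen = torch.Generator().manual_seed(seed)
        # Random multipliers for simple multiplicative hashing (assumption).
        self.register_buffer(
            "hash_mults", torch.randint(1, 2**31 - 1, (num_bits,), generator=gen)
        )
        self.proj = nn.Linear(num_bits, dim)  # trained jointly with the GNN

    def bit_codes(self, node_ids: torch.Tensor) -> torch.Tensor:
        # One bit per hash function: bit 16 of (id * mult).
        h = node_ids.unsqueeze(-1) * self.hash_mults  # (N, num_bits)
        return ((h >> 16) & 1).float()

    def forward(self, node_ids: torch.Tensor) -> torch.Tensor:
        # Fixed (non-learned) bit vector -> dense embedding for the GNN input.
        return self.proj(self.bit_codes(node_ids))

# Usage: dense input embeddings for three nodes, fed to the first GNN layer.
emb = HashedBitEmbedding(num_bits=64, dim=128)
x = emb(torch.tensor([0, 41, 1_000_003]))
print(x.shape)  # torch.Size([3, 128])
```

Storing only per-node bit vectors plus one shared projection is what makes the memory footprint compatible with end-to-end GPU training on large graphs, which is the bottleneck the abstract identifies.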
Year
2022
DOI
10.1145/3534678.3539068
Venue
KDD '22: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
DocType
Conference
Citations
0
PageRank
0.34
References
1
Authors
9
Name | Order | Citations | PageRank
Chin-Chia Michael Yeh | 1 | 0 | 0.34
Mengting Gu | 2 | 0 | 0.34
Yan Zheng | 3 | 0 | 2.37
Huiyuan Chen | 4 | 24 | 7.11
Javid Ebrahimi | 5 | 0 | 0.68
Zhongfang Zhuang | 6 | 8 | 3.54
Junpeng Wang | 7 | 101 | 10.27
Liang Wang | 8 | 4317 | 243.28
Wei Zhang | 9 | 21 | 1.88