Title
COSINE: Compressive Network Embedding on Large-Scale Information Networks
Abstract
There has recently been a surge of approaches that learn low-dimensional embeddings of nodes in networks. However, for large-scale real-world networks, it is inefficient for existing approaches to store large numbers of parameters in memory and update them edge by edge. Based on the observation that nodes with similar neighborhoods will be close to each other in the embedding space, we propose the COSINE (COmpresSIve Network Embedding) algorithm, which reduces the memory footprint and accelerates the training process by sharing parameters among similar nodes. COSINE applies graph partitioning algorithms to networks and builds the parameter-sharing dependency of nodes based on the results of partitioning. In this way, COSINE injects prior knowledge about high-order structural information into models, which makes network embedding more efficient and effective. COSINE can be applied to any embedding lookup method and learns high-quality embeddings with limited memory and less training time. We conduct experiments on multi-label classification and link prediction, where the baselines and our model have the same memory usage. Experimental results show that COSINE improves the baselines by up to 23 percent on classification and 25 percent on link prediction. Moreover, the training time of all representation learning methods using COSINE decreases by 30 to 70 percent.
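The abstract describes COSINE's core mechanism, a compressed embedding lookup with parameters shared across partition groups, only at a high level. The following minimal Python sketch illustrates that general idea; the group count, the hash-based node-to-group assignment (a stand-in for a real graph partitioner), and the mean aggregation are illustrative assumptions, not the paper's actual design.

import numpy as np

rng = np.random.default_rng(0)

num_nodes = 1_000_000      # nodes in the network
num_groups = 10_000        # shared parameter groups (<< num_nodes)
dim = 128                  # embedding dimension
groups_per_node = 8        # shared vectors composing one node embedding

# Shared parameters: one vector per group instead of one per node,
# cutting memory from num_nodes*dim to num_groups*dim floats.
group_vectors = rng.normal(scale=0.1, size=(num_groups, dim)).astype(np.float32)

# Hypothetical assignment: here each node is simply hashed to random
# groups; COSINE instead derives this mapping from graph partitioning,
# so that structurally similar nodes end up sharing parameters.
node_to_groups = rng.integers(0, num_groups, size=(num_nodes, groups_per_node))

def embed(node_id: int) -> np.ndarray:
    """Compose a node embedding by averaging its shared group vectors."""
    return group_vectors[node_to_groups[node_id]].mean(axis=0)

# A gradient step for one node touches only its few group vectors,
# and each update also benefits every other node sharing those groups.
print(embed(42).shape)  # (128,)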
Year
2022
DOI
10.1109/TKDE.2020.3030539
Venue
IEEE Transactions on Knowledge and Data Engineering
Keywords
Node classification, link prediction, large-scale real-world network, network embedding, model compression
DocType
Journal
Volume
34
Issue
8
ISSN
1041-4347
Citations
0
PageRank
0.34
References
24
Authors
7
Name | Order | Citations | PageRank
Zhengyan Zhang | 1 | 105 | 8.78
Cheng Yang | 2 | 0 | 0.34
Zhiyuan Liu | 3 | 2037 | 123.68
Maosong Sun | 4 | 2293 | 162.86
Zhichong Fang | 5 | 0 | 0.34
Bo Zhang | 6 | 41 | 9.80
Leyu Lin | 7 | 56 | 14.37