Title
ETIV: Embedding Temporal Network via Interest Vector
Abstract
Network representation learning has attracted increasing attention in both academia and industry. It converts large-scale network data into low-dimensional node embeddings for various tasks, such as node classification and link prediction. Recently, representation learning on dynamic networks has emerged, which better matches real-world situations. Inspired by the interactions between entities in real life, we propose a model for embedding temporal networks via interest vectors (ETIV). The interest vector of an entity is generated from the entities it has historically interacted with, and it can be used to infer the probability of interaction between entities. We then obtain node representations by optimizing this interaction probability. To compute a node's interest vector, we use a multi-head attention mechanism to capture information about its historical interaction nodes from different aspects. Moreover, based on the interaction times of historical nodes, we introduce a learnable time parameter to simulate the forgetting of historical information. Experiments on three real-world datasets show that our model outperforms state-of-the-art methods on various tasks.
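The abstract does not give the exact formulation, but the mechanism it describes (multi-head attention over a node's historical interaction partners, with a learnable decay applied to older interactions) can be sketched roughly as follows. This is a minimal illustrative reading, not the paper's actual model: the decay rate `delta`, the application of the decay to attention logits, and the sigmoid interaction score are all assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def interest_vector(node_emb, hist_embs, hist_times, now, delta, num_heads=2):
    """Aggregate embeddings of historically interacted nodes into an interest vector.

    node_emb:   (d,)  embedding of the target node, used as the attention query
    hist_embs:  (n, d) embeddings of its historical interaction nodes (keys/values)
    hist_times: (n,)  timestamps of those interactions
    delta:      assumed learnable decay rate simulating forgetting of old interactions
    """
    d = node_emb.shape[0]
    dh = d // num_heads
    heads = []
    for h in range(num_heads):  # each head attends to a different slice ("aspect")
        q = node_emb[h * dh:(h + 1) * dh]
        K = hist_embs[:, h * dh:(h + 1) * dh]
        scores = K @ q / np.sqrt(dh)                   # scaled dot-product attention
        scores = scores - delta * (now - hist_times)   # older interactions get lower weight
        w = softmax(scores)
        heads.append(w @ K)                            # per-head weighted sum
    return np.concatenate(heads)                       # (d,) interest vector

def interaction_score(interest_u, emb_v):
    # probability-like score that the node interacts with node v (assumed sigmoid form)
    return 1.0 / (1.0 + np.exp(-interest_u @ emb_v))
```

In training, such a score would be optimized (e.g. against observed versus negative-sampled interactions) to produce the node representations; the loss itself is not specified in the abstract.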
Year
2021
DOI
10.1109/IJCNN52387.2021.9534129
Venue
2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN)
Keywords
Network Representation Learning, Temporal Networks, Interest Vector, Time Parameter
DocType
Conference
ISSN
2161-4393
Citations
0
PageRank
0.34
References
0
Authors
5
Name            Order  Citations  PageRank
Jiangting Fan   1      0          0.34
Haojie Chen     2      0          0.34
Jiaming Wu      3      0          0.34
Yong Liu        4      0          2.03
Nan Wang        5      93         27.47