Title
Time-Aware Representation Learning of Knowledge Graphs
Abstract
Representation learning is a fundamental task in knowledge-graph research and applications. Most existing approaches learn representations for entities and relations from static facts alone, ignoring temporal information entirely. This paper aims to learn time-aware representations for entities and relations in knowledge graphs. Based on how temporal information affects the learned embeddings, we propose three assumptions and build a corresponding model for each: BTS, ETS, and RTS. In these models, entities and relations are embedded in two separate spaces, and the standard translation condition is checked after projecting embedding vectors between the spaces with model-specific transformations. The proposed RTS model achieves state-of-the-art results in three experiments on two datasets, YAGO11k and Wikidata12k, which validates its effectiveness. Comparing the results of the three models, we find that relation embeddings are time-sensitive and form a natural ordering, whereas the effect of time on entity embeddings can be safely ignored in translation-based methods. Experiments also show that these findings can be used to simplify existing models such as HyTE.
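The abstract describes a translation-based architecture: entities and relations live in two separate embedding spaces, and the usual translation condition (head + relation ≈ tail) is checked only after vectors are projected between the spaces by model-specific, time-dependent transformations. The exact BTS/ETS/RTS formulations are not given here, so the following is a minimal sketch under assumed details: a TransE-style L1 score and one projection matrix per timestamp. The shapes, parameter names, and the score function itself are illustrative, not the authors' definitions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim_e, dim_r = 64, 64  # entity-space / relation-space dimensions (assumed equal)

# Hypothetical parameters: separate embedding spaces for entities and relations,
# plus a per-timestamp projection matrix standing in for the "model-specific
# transformation" between the two spaces mentioned in the abstract.
E = rng.normal(scale=0.1, size=(1000, dim_e))        # entity embeddings
R = rng.normal(scale=0.1, size=(50, dim_r))          # relation embeddings
P = rng.normal(scale=0.1, size=(100, dim_r, dim_e))  # time-indexed projections

def score(h: int, r: int, t: int, tau: int) -> float:
    """Translation score of the quadruple (h, r, t, tau); lower is better.

    Head and tail entities are first projected into the relation space with
    the timestamp-specific matrix P[tau]; the translation condition
    P[tau] @ E[h] + R[r] ~= P[tau] @ E[t] is then checked under the L1 norm.
    """
    ph = P[tau] @ E[h]
    pt = P[tau] @ E[t]
    return float(np.linalg.norm(ph + R[r] - pt, ord=1))

# Plausibility of an example quadruple (head=3, relation=7, tail=42, time=12).
print(score(3, 7, 42, 12))
```

In a full model the embeddings and projections would be trained, e.g., with the margin-based ranking loss over corrupted quadruples that is standard for translation-based methods; this sketch only illustrates where the projection and the translation check sit relative to each other.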
Year
2021
DOI
10.1109/IJCNN52387.2021.9533920
Venue
2021 International Joint Conference on Neural Networks (IJCNN)
DocType
Conference
ISSN
2161-4393
Citations
0
PageRank
0.34
References
0
Authors
3
Name         Order  Citations  PageRank
Zikang Wang  1      0          1.35
Linjing Li   2      39         12.91
Daniel Zeng  3      2539       286.59