Title
Talking-heads attention-based knowledge representation for link prediction
Abstract
State-of-the-art methods for link prediction, also known as knowledge graph embedding, aim to represent the entities and relations of a given knowledge graph (KG) in a continuous low-dimensional vector space, and can thus be used to fill in missing facts or identify spurious facts in KGs, where a fact is represented as a triple of the form (head entity, relation, tail entity). Most previous attempts learn each triple independently and thus fail to exploit the rich inference and semantic information hidden in the local neighbourhood surrounding each triple. To this end, this paper proposes a talking-heads attention-based knowledge representation method, a novel graph attention network-based method for link prediction that learns knowledge graph embeddings under talking-heads attention guidance from multi-hop neighbourhood triples. We evaluate our model on the Freebase, WordNet and Kinship link prediction datasets; the experimental results demonstrate that injecting the talking-heads attention mechanism better captures the semantic relationships among neighbouring triples and indeed achieves promising performance on link prediction.
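The full paper is not reproduced in this record, but as a rough illustration of the talking-heads mechanism the title refers to (Shazeer et al., 2020, "Talking-Heads Attention"), the following is a minimal NumPy sketch: standard scaled dot-product attention with learned linear mixing across the head dimension both before and after the softmax. The function and parameter names (talking_heads_attention, P_logits, P_weights) are illustrative assumptions, not the authors' implementation.

import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def talking_heads_attention(Q, K, V, P_logits, P_weights):
    # Q, K: (heads, n, d_k); V: (heads, n, d_v).
    # P_logits, P_weights: (heads, heads) matrices that mix information
    # across heads before and after the softmax; this cross-head mixing
    # is what distinguishes talking-heads from vanilla multi-head attention.
    d_k = Q.shape[-1]
    logits = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)        # (heads, n, n)
    logits = np.einsum('hij,hg->gij', logits, P_logits)     # mix heads pre-softmax
    weights = softmax(logits, axis=-1)                      # attend over keys
    weights = np.einsum('hij,hg->gij', weights, P_weights)  # mix heads post-softmax
    return weights @ V                                      # (heads, n, d_v)

# Tiny usage example on random inputs.
h, n, d = 4, 5, 8
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((h, n, d)) for _ in range(3))
out = talking_heads_attention(Q, K, V,
                              rng.standard_normal((h, h)),
                              rng.standard_normal((h, h)))
print(out.shape)  # (4, 5, 8)

In the paper's setting, the queries, keys and values would presumably be built from the entity and relation embeddings of multi-hop neighbourhood triples inside a graph attention network layer; that wiring is paper-specific and not shown here.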
Year
2022
DOI
10.1016/j.csl.2021.101340
Venue
COMPUTER SPEECH AND LANGUAGE
Keywords
Knowledge representation, Link prediction, Talking-heads attention
DocType
Journal
Volume
74
ISSN
0885-2308
Citations
0
PageRank
0.34
References
0
Authors
3
Name          Order  Citations  PageRank
Shirui Wang   1      0          0.68
Wenan Zhou    2      50         19.20
Qiang Zhou    3      0          0.68