Title
Leveraging Context Information For Joint Entity And Relation Linking
Abstract
As an important module in most knowledge base question answering (KBQA) systems, entity and relation linking maps proper nouns and relational phrases to the corresponding semantic constructs (entities and relations, respectively) in a given KB. Because different entities/relations may share the same mention, joint disambiguation has been proposed to identify the exact entity/relation from a list of candidates using context information. Existing joint disambiguation methods, such as the one in EARL (Entity and Relation Linker), mainly focus on modeling the co-occurrence probabilities of different entities and relations in input questions, while paying little attention to other non-mention expressions (e.g., wh-words). In this paper, we propose the Extended Entity and Relation Linker (EEARL), which leverages full context information to improve linking accuracy. EEARL first extracts the context information for each mention and the attribute features for each entity/relation via character-level and word-level LSTMs, constructing context vectors and feature vectors, respectively; it then calculates the similarity between the two vectors to rescore all the candidates. Experimental results on two benchmark datasets (LC-QuAD and QALD) show that EEARL outperforms EARL and several baseline methods in terms of both entity linking and relation linking accuracy.
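The rescoring step described in the abstract can be illustrated with a minimal sketch. It assumes the character-level and word-level LSTM encoders have already produced fixed-length vectors for the mention context and for each candidate's attribute features; the linear blend with weight `alpha` is an illustrative assumption, not the paper's exact scoring formula.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rescore(context_vec, candidates, alpha=0.5):
    """Rescore candidates by blending each candidate's base score with its
    context similarity (alpha is a hypothetical blend weight).

    candidates: list of (name, feature_vec, base_score) triples, where
    feature_vec is assumed to come from the candidate's LSTM-encoded
    attribute features and context_vec from the mention's context.
    Returns candidates sorted by the blended score, best first.
    """
    blended = [
        (name, alpha * score + (1 - alpha) * cosine(context_vec, fv))
        for name, fv, score in candidates
    ]
    return sorted(blended, key=lambda x: x[1], reverse=True)
```

With `alpha=0` the ranking is driven purely by context similarity, so a candidate whose feature vector aligns with the mention's context vector moves to the top even if its original co-occurrence-based score was low.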
Year
2019
DOI
10.1007/978-3-030-33982-1_3
Venue
WEB AND BIG DATA, APWEB-WAIM 2019
Keywords
Entity linking, Relation linking, Joint entity and relation linking, Knowledge base question answering, Context information
Field
Entity linking, Data mining, Feature vector, Expression (mathematics), Computer science, Knowledge base question answering, Artificial intelligence, Natural language processing, Proper noun
DocType
Conference
Volume
11809
ISSN
0302-9743
Citations
0
PageRank
0.34
References
0
Authors
3
Author details (Name, Order, Citations, PageRank)
Yao Zhao11926219.11
Zhuoming Xu2114.29
Yuzhong Qu372662.49