Title
Evaluating the Impact of Knowledge Graph Context on Entity Disambiguation Models
Abstract
Pretrained transformer models have emerged as state-of-the-art approaches that learn contextual information from text to improve the performance of several NLP tasks. These models, albeit powerful, still require specialized knowledge in specific scenarios. In this paper, we argue that context derived from a knowledge graph (in our case, Wikidata) provides enough signal to inform pretrained transformer models and improve their performance for named entity disambiguation (NED) on the Wikidata knowledge graph (KG). We further hypothesize that our proposed KG context can be standardized for Wikipedia, and we evaluate the impact of KG context on a state-of-the-art NED model for the Wikipedia knowledge base. Our empirical results validate that the proposed KG context generalizes (to Wikipedia), and that providing KG context to transformer architectures considerably outperforms the existing baselines, including vanilla transformer models.
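To make the idea concrete, the sketch below shows one common way KG context can be supplied to a transformer for NED: serialize a candidate entity's Wikidata label, description, and aliases into text and pair it with the mention's sentence in a cross-encoder, scoring each candidate. This is a minimal illustration, not the authors' implementation; the base model name, the choice of context fields, and the serialization format are assumptions, and in practice the classification head would be fine-tuned on labeled mention-entity pairs before the scores are meaningful.

```python
# Minimal sketch: score Wikidata candidates for a mention by pairing the
# mention's sentence with serialized KG context in a transformer cross-encoder.
# Assumptions: bert-base-uncased as the backbone; label/description/aliases as
# the KG context fields; an (untrained here) binary "match" classification head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL = "bert-base-uncased"  # assumed backbone, for illustration only
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=2)
model.eval()

def kg_context(candidate: dict) -> str:
    """Serialize Wikidata-style context (label, description, aliases) to text."""
    parts = [candidate["label"], candidate.get("description", "")]
    parts += candidate.get("aliases", [])
    return " ; ".join(p for p in parts if p)

def score_candidates(sentence: str, mention: str, candidates: list) -> list:
    """Return one relevance score per candidate; higher means a better match."""
    scores = []
    for cand in candidates:
        # Segment A: mention in its sentence context; segment B: KG context.
        # The transformer attends across both segments jointly.
        enc = tokenizer(
            f"{mention} : {sentence}",
            kg_context(cand),
            truncation=True,
            max_length=256,
            return_tensors="pt",
        )
        with torch.no_grad():
            logits = model(**enc).logits
        # Take the "match" logit (index 1) as the candidate's score.
        scores.append(logits[0, 1].item())
    return scores

# Hypothetical usage: two Wikidata candidates for the mention "Paris".
candidates = [
    {"label": "Paris", "description": "capital of France",
     "aliases": ["City of Light"]},
    {"label": "Paris Hilton", "description": "American media personality",
     "aliases": []},
]
scores = score_candidates("Paris hosted the 1900 Summer Olympics.",
                          "Paris", candidates)
print(candidates[scores.index(max(scores))]["label"])
```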
Year: 2020
DOI: 10.1145/3340531.3412159
Venue: CIKM '20: The 29th ACM International Conference on Information and Knowledge Management, Virtual Event, Ireland, October 2020
DocType: Conference
ISBN: 978-1-4503-6859-9
Citations: 0
PageRank: 0.34
References: 17
Authors: 6
Name                  | Order | Citations | PageRank
Isaiah Onando Mulang' | 1     | 0         | 1.01
Kuldeep Singh         | 2     | 169       | 22.88
Chaitali Prabhu       | 3     | 0         | 0.34
Abhishek Nadgeri      | 4     | 2         | 1.70
Johannes Hoffart      | 5     | 1362      | 52.62
Jens Lehmann          | 6     | 5375      | 355.08