Title
Neural Entity Linking on Technical Service Tickets
Abstract
Entity linking, the task of mapping textual mentions to known entities, has recently been tackled using contextualized neural networks. We address the question of whether these results (reported for large, high-quality datasets such as Wikipedia) transfer to practical business use cases, where labels are scarce, text is low-quality, and terminology is highly domain-specific. Using an entity linking model based on BERT, a popular transformer network in natural language processing, we show that a neural approach outperforms and complements hand-coded heuristics, with improvements of about 20% in top-1 accuracy. We also demonstrate the benefits of transfer learning on a large corpus, while fine-tuning proves difficult. Finally, we compare different BERT-based architectures and show that a simple sentence-wise encoding (Bi-Encoder) offers a fast yet efficient search in practice.
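The Bi-Encoder mentioned in the abstract encodes mention context and entity descriptions independently and ranks candidates by vector similarity. The following is a minimal sketch of that idea, not the paper's actual implementation: the checkpoint ("bert-base-uncased"), the [CLS]-token pooling, and the ticket/entity strings are illustrative assumptions.

import torch
from transformers import AutoTokenizer, AutoModel

# Assumed checkpoint; the paper's exact model and pooling strategy are not given here.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    # Encode each text independently and use its [CLS] vector as the embedding.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = encoder(**batch)
    return out.last_hidden_state[:, 0]          # shape: (batch, hidden)

# Hypothetical ticket mention and candidate entity descriptions (illustrative only).
mention = ["Printer XJ-500 reports error E42 after the firmware update"]
entities = [
    "XJ-500: office laser printer, firmware family 3.x",
    "E42: paper feed sensor failure",
    "VPN gateway: remote-access appliance",
]

# Bi-Encoder scoring: mention and entities are encoded separately and
# candidates are ranked by dot-product similarity.
scores = embed(mention) @ embed(entities).T     # shape: (1, num_entities)
print(entities[int(scores.argmax())])

Because the entity embeddings do not depend on the mention, they can be precomputed and indexed, which is what makes this sentence-wise encoding fast at query time.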
Year
2020
DOI
10.1109/SDS49233.2020.00014
Venue
2020 7th Swiss Conference on Data Science (SDS)
Keywords
Entity Linking, Attention Models, Natural Language Processing
DocType
Conference
ISBN
978-1-7281-7177-7
Citations
0
PageRank
0.34
References
0
Authors
3
Name           Order  Citations  PageRank
Kurz Nadja     1      0          0.34
Hamann Felix   2      0          0.34
Adrian Ulges   3      328        26.61