Title
MGAD: Learning Descriptional Representation Distilled from Distributional Semantics for Unseen Entities.
Abstract
Entity representation plays a central role in building effective entity retrieval models. Recent works propose to learn entity representations from entity-centric contexts, achieving state-of-the-art performance on many tasks. However, these methods yield poor representations for unseen entities, since they rely on a multitude of occurrences of each entity to produce accurate representations. To address this issue, we propose to learn enhanced descriptional representations for unseen entities by distilling knowledge from distributional semantics into descriptional embeddings. Specifically, we infer enhanced embeddings for unseen entities from their descriptions by aligning the descriptional embedding space to the distributional embedding space at different granularities, i.e., element-level, batch-level and space-level alignment. Experimental results on four benchmark datasets show that our approach outperforms all baseline methods. In particular, our approach matches the effectiveness of the teacher model on almost all entities, and maintains this high performance on unseen entities.
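The abstract's multi-granularity alignment could be sketched roughly as follows. This is an illustrative assumption, not the paper's exact objective: element-level alignment is rendered as a direct MSE between student (descriptional) and teacher (distributional) embeddings, batch-level alignment as matching within-batch similarity matrices, and space-level alignment as a CORAL-style matching of the mean and covariance of the two spaces. Function names and loss weights are hypothetical.

```python
import numpy as np

def element_loss(student, teacher):
    # Element-level alignment: pull each descriptional embedding toward
    # its distributional counterpart (plain MSE; illustrative choice).
    return np.mean((student - teacher) ** 2)

def batch_loss(student, teacher):
    # Batch-level alignment: match the pairwise similarity structure of
    # the two embedding sets within a batch.
    s_sim = student @ student.T
    t_sim = teacher @ teacher.T
    return np.mean((s_sim - t_sim) ** 2)

def space_loss(student, teacher):
    # Space-level alignment: match first- and second-order statistics of
    # the two embedding spaces (a moment-matching proxy; assumption).
    mean_gap = np.sum((student.mean(axis=0) - teacher.mean(axis=0)) ** 2)
    cov_gap = np.sum((np.cov(student, rowvar=False)
                      - np.cov(teacher, rowvar=False)) ** 2)
    return mean_gap + cov_gap

def distill_loss(student, teacher, w=(1.0, 1.0, 1.0)):
    # Combined multi-granularity distillation objective (weights hypothetical).
    return (w[0] * element_loss(student, teacher)
            + w[1] * batch_loss(student, teacher)
            + w[2] * space_loss(student, teacher))
```

When the student embeddings coincide with the teacher's, all three terms vanish; any mismatch at the element, batch, or space level contributes a positive penalty.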
Year
2022
DOI
10.24963/ijcai.2022/611
Venue
International Joint Conference on Artificial Intelligence (IJCAI)
Keywords
Natural Language Processing: Named Entities, Natural Language Processing: Information Retrieval and Text Mining, Natural Language Processing: Coreference Resolution, Natural Language Processing: Embeddings, Natural Language Processing: Natural Language Semantics
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
7
Name            Order  Citations  PageRank
Yuanzheng Wang  1      81         3.97
Xueqi Cheng     2      3148       247.04
Yixing Fan      3      202        19.39
Xiaofei Zhu     4      0          0.34
Huasheng Liang  5      0          0.68
Qiang Yan       6      0          0.34
Jiafeng Guo     7      1737       102.17