Title |
---|
Neural Latent Relational Analysis to Capture Lexical Semantic Relations in a Vector Space |
Abstract |
---|
Capturing the semantic relations of words in a vector space contributes to many natural language processing tasks. One promising approach exploits lexico-syntactic patterns as features of word pairs. In this paper, we propose a novel model of this pattern-based approach, neural latent relational analysis (NLRA). NLRA can generalize co-occurrences of word pairs and lexico-syntactic patterns, and obtain embeddings of word pairs that do not co-occur. This overcomes the critical data sparseness problem encountered in previous pattern-based models. Our experimental results on measuring relational similarity demonstrate that NLRA outperforms the previous pattern-based models. In addition, when combined with a vector offset model, NLRA achieves a performance comparable to that of the state-of-the-art model that exploits additional semantic relational data. |
Year | Venue | DocType |
---|---|---|
2018 | EMNLP | Journal |
Volume | Citations | PageRank |
---|---|---|
abs/1809.03401 | 1 | 0.35 |
References | Authors |
---|---|
29 | 2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Koki Washio | 1 | 1 | 0.35 |
Tsuneaki Kato | 2 | 271 | 38.70 |