Abstract
---
Word embeddings are now a standard technique for inducing meaning representations for words. To obtain good representations, it is important to account for the different senses of a word. In this paper, we propose a mixture model for learning multi-sense word embeddings. Our model generalizes previous work in that it can induce different weights for the different senses of a word. Experimental results show that our model outperforms previous models on standard evaluation tasks.
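The core idea of the abstract, representing a word as a weighted mixture of per-sense vectors, can be sketched as follows. This is a minimal illustration only, not the paper's actual model: the number of senses `K`, dimension `d`, and the softmax-over-context weighting scheme are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: one word with K sense vectors of dimension d,
# plus a context vector used to weight the senses.
K, d = 3, 50
sense_vecs = rng.normal(size=(K, d))   # one embedding per sense
context_vec = rng.normal(size=d)       # representation of the context

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Sense weights induced from the context: senses whose vectors align
# better with the context receive larger weights.
weights = softmax(sense_vecs @ context_vec)

# Mixture embedding: convex combination of the sense vectors.
word_vec = weights @ sense_vecs

assert np.isclose(weights.sum(), 1.0)
print(word_vec.shape)
```

A single-sense model is the special case where one weight is 1 and the rest are 0, which is one way to read the claim that the mixture model generalizes earlier approaches.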
Year | DOI | Venue
---|---|---
2017 | 10.18653/v1/S17-1015 | *SEM

DocType | Volume | Citations
---|---|---
Conference | abs/1706.05111 | 2

PageRank | References | Authors
---|---|---
0.35 | 33 | 5
Name | Order | Citations | PageRank |
---|---|---|---
Dai Quoc Nguyen | 1 | 107 | 13.49 |
Dat Quoc Nguyen | 2 | 246 | 25.87 |
Ashutosh Modi | 3 | 52 | 6.16 |
Stefan Thater | 4 | 756 | 38.54 |
Manfred Pinkal | 5 | 1116 | 69.77 |