| Title |
|---|
| Automatic Selection of Context Configurations for Improved (and Fast) Class-Specific Word Representations |
| Abstract |
|---|
| Recent work has demonstrated that state-of-the-art word embedding models require different context types to produce high-quality representations for different word classes such as adjectives (A), verbs (V), and nouns (N). This paper is concerned with identifying contexts useful for learning A/V/N-specific representations. We introduce a simple yet effective framework for selecting class-specific context configurations that yield improved representations for each class. We propose an automatic A* style selection algorithm that effectively searches only a fraction of the large configuration space. The results on predicting similarity scores for the A, V, and N subsets of the benchmarking SimLex-999 evaluation set indicate that our method is useful for each class: the improvements are 6% (A), 6% (V), and 5% (N) over the best previously proposed context type for each class. At the same time, the model trains on only 14% (A), 26.2% (V), and 33.6% (N) of all dependency-based contexts, resulting in much shorter training time. |
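The abstract describes an A*-style best-first search that explores only a fraction of the exponential space of context configurations. The paper's actual algorithm, scoring function, and inventory of dependency-based context types are not given in this record, so the sketch below is purely illustrative: `score_fn` is a hypothetical stand-in for evaluating a configuration (e.g., SimLex-999 correlation for one word class), and the context-type names are made up.

```python
import heapq

def select_configuration(context_types, score_fn):
    """Best-first (A*-style) search over subsets of context types.

    Starts from the empty configuration and repeatedly expands the
    highest-scoring frontier configuration by adding one unused context
    type. Only children that improve on their parent are expanded, so
    typically just a fraction of the 2^n subsets is ever evaluated.
    """
    start = frozenset()
    best, best_score = start, score_fn(start)
    # Max-heap via negated scores; sorted lists make heap entries comparable.
    frontier = [(-best_score, sorted(start))]
    visited = {start}
    while frontier:
        neg_score, config_list = heapq.heappop(frontier)
        config, score = frozenset(config_list), -neg_score
        for ct in context_types:
            if ct in config:
                continue
            child = config | {ct}
            if child in visited:
                continue
            visited.add(child)
            child_score = score_fn(child)
            if child_score > score:  # expand only improving moves
                heapq.heappush(frontier, (-child_score, sorted(child)))
                if child_score > best_score:
                    best, best_score = child, child_score
    return best, best_score

# Toy example with additive (hypothetical) per-context-type utilities.
weights = {"amod": 0.3, "dobj": 0.2, "nsubj": -0.1}
config, score = select_configuration(
    list(weights), lambda c: sum(weights[x] for x in c)
)
```

With the toy additive scorer, the search keeps `amod` and `dobj` and prunes `nsubj`, never expanding any subset containing the harmful context type.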
| Year | Venue | DocType |
|---|---|---|
| 2016 | CoRR | Journal |

| Volume | Citations | PageRank |
|---|---|---|
| abs/1608.05528 | 2 | 0.36 |

| References | Authors |
|---|---|
| 33 | 5 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Ivan Vulic | 1 | 462 | 52.59 |
| Roy Schwartz | 2 | 184 | 14.76 |
| Ari Rappoport | 3 | 1816 | 129.95 |
| Roi Reichart | 4 | 760 | 53.53 |
| Anna Korhonen | 5 | 1336 | 92.50 |