Title
Embedding Word Similarity with Neural Machine Translation
Abstract
Neural language models learn word representations, or embeddings, that capture rich linguistic and conceptual information. Here we investigate the embeddings learned by neural machine translation models, a recently-developed class of neural language model. We show that embeddings from translation models outperform those learned by monolingual models at tasks that require knowledge of both conceptual similarity and lexical-syntactic role. We further show that these effects hold when translating from both English to French and English to German, and argue that the desirable properties of translation embeddings should emerge largely independently of the source and target languages. Finally, we apply a new method for training neural translation models with very large vocabularies, and show that this vocabulary expansion algorithm results in minimal degradation of embedding quality. Our embedding spaces can be queried in an online demo and downloaded from our web page. Overall, our analyses indicate that translation-based embeddings should be used in applications that require concepts to be organised according to similarity and/or lexical function, while monolingual embeddings are better suited to modelling (nonspecific) inter-word relatedness.
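The abstract compares embedding spaces on tasks built around word similarity. As a minimal, purely illustrative sketch (not the paper's code or data format), the snippet below shows how an embedding space of the kind described above could be queried for nearest neighbours by cosine similarity; the text file format and the example file name are assumptions.

```python
# Illustrative sketch: nearest-neighbour queries over a word-embedding space.
# Assumes a plain-text file with one word per line followed by its vector values.
import numpy as np

def load_embeddings(path):
    """Load whitespace-separated word vectors into a dict of numpy arrays."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split()
            if len(parts) < 2:
                continue
            vectors[parts[0]] = np.array(parts[1:], dtype=np.float32)
    return vectors

def nearest_neighbours(vectors, query, k=5):
    """Return the k words whose embeddings are most cosine-similar to `query`."""
    q = vectors[query]
    q = q / np.linalg.norm(q)
    scores = {
        word: float(vec @ q / np.linalg.norm(vec))
        for word, vec in vectors.items()
        if word != query
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Hypothetical usage:
# vecs = load_embeddings("translation_embeddings.txt")
# print(nearest_neighbours(vecs, "teacher"))
```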
Year
2014
Venue
International Conference on Learning Representations
Field
Embedding, Web page, Computer science, Machine translation, Artificial intelligence, Natural language processing, Vocabulary, Language model, Machine learning, Lexical function, German
DocType
Volume
abs/1412.6448
Citations
9
Journal
PageRank
0.52
References
19
Authors
5
Name | Order | Citations | PageRank
Felix Hill | 1 | 10 | 0.88
Kyunghyun Cho | 2 | 26 | 5.53
Sebastien Jean | 3 | 457 | 20.62
Coline Devin | 4 | 102 | 6.55
Yoshua Bengio | 5 | 42677 | 3039.83