Title
Multilingual Neural Machine Translation with Knowledge Distillation.
Abstract
Multilingual machine translation, which translates multiple languages with a single model, has attracted much attention due to its efficiency in offline training and online serving. However, a traditional multilingual model usually yields inferior accuracy compared with individual models trained separately for each language pair, due to language diversity and limited model capacity. In this paper, we propose a distillation-based approach to boost the accuracy of multilingual machine translation. Specifically, individual models are first trained and regarded as teachers, and then the multilingual model is trained to simultaneously fit the training data and match the outputs of the individual models through knowledge distillation. Experiments on the IWSLT, WMT and TED talk translation datasets demonstrate the effectiveness of our method. In particular, we show that one model is enough to handle multiple languages (up to 44 languages in our experiments), with comparable or even better accuracy than individual models.
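The training recipe described in the abstract (fit the parallel data while matching each teacher's outputs) corresponds to a standard word-level knowledge-distillation objective. The sketch below is only an illustration under assumed notation (x source sentence, y target sentence, Q the teacher's output distribution, P the multilingual student, lambda a mixing weight); the paper's exact formulation may differ.

\[
\mathcal{L}(\theta)
= -\sum_{t=1}^{T} \log P\bigl(y_t \mid y_{<t}, x; \theta\bigr)
\;+\; \lambda \left( -\sum_{t=1}^{T} \sum_{k=1}^{|V|} Q\bigl(y_t = k \mid y_{<t}, x\bigr) \log P\bigl(y_t = k \mid y_{<t}, x; \theta\bigr) \right)
\]

The first term is the usual negative log-likelihood on the training data; the second matches the student's per-token distribution to the teacher's over the vocabulary V. For each language pair, the teacher Q is the corresponding individual model and P is the single shared multilingual model.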
Year
2019
Venue
ICLR
Field
Computer science, Machine translation, Distillation, Artificial intelligence, Natural language processing, Machine learning
DocType
Volume
abs/1902.10461
Citations
1
Journal
PageRank
0.36
References
27
Authors
6
Name          Order  Citations  PageRank
Xu Tan        1      88         23.94
Yi Ren        2      5          7.55
Di He         3      154        19.76
Tao Qin       4      2384       147.25
Zhou Zhao     5      773        90.87
Tie-yan Liu   6      4662       256.32