Title
Improving Zero-shot Translation with Language-Independent Constraints.
Abstract
An important concern in training multilingual neural machine translation (NMT) is translating between language pairs unseen during training, i.e., zero-shot translation. Improving this ability kills two birds with one stone: it provides an alternative to pivot translation and also allows us to better understand how the model captures information between languages. In this work, we carry out an investigation of this capability of multilingual NMT models. First, we intentionally create an encoder architecture that is independent of the source language. Such experiments shed light on the ability of NMT encoders to learn multilingual representations in general. Based on this proof of concept, we design regularization methods for the standard Transformer model so that the whole architecture becomes more robust in zero-shot conditions. We investigate the behaviour of such models on the standard IWSLT 2017 multilingual dataset, achieving an average improvement of 2.23 BLEU points across 12 language pairs compared to the zero-shot performance of a state-of-the-art multilingual system. Additionally, further experiments confirm the effect even for language pairs with multiple intermediate pivots.
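For illustration, below is a minimal PyTorch sketch of one way such a language-independence constraint could be realized: a penalty on the distance between pooled encoder representations of a source sentence and its reference translation, added to the usual translation loss. The abstract does not prescribe this exact formulation; the function names, the MSE choice, and the model API in the usage comment (`model.encoder`, `lambda_reg`) are illustrative assumptions, not the authors' code.

```python
# Sketch (not the paper's implementation): regularize a shared multilingual
# encoder so that a sentence and its translation map to nearby representations.
import torch
import torch.nn.functional as F

def mean_pool(states: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Average encoder states over non-padding positions.
    states: (batch, time, dim); mask: (batch, time), 1 for real tokens."""
    m = mask.unsqueeze(-1).float()
    return (states * m).sum(dim=1) / m.sum(dim=1).clamp(min=1.0)

def language_independence_penalty(src_states, src_mask, tgt_states, tgt_mask):
    """MSE between pooled source-side and target-side sentence vectors
    produced by the same (shared) encoder."""
    return F.mse_loss(mean_pool(src_states, src_mask),
                      mean_pool(tgt_states, tgt_mask))

# Usage inside a training step (all names below are hypothetical):
#   ce = cross_entropy_translation_loss(model, src, tgt)
#   src_states = model.encoder(src, src_mask)
#   with torch.no_grad():                  # optionally stop gradients on one side
#       tgt_states = model.encoder(tgt, tgt_mask)
#   loss = ce + lambda_reg * language_independence_penalty(
#       src_states, src_mask, tgt_states, tgt_mask)
```

Minimizing such a penalty on the parallel data seen during training is one plausible way to push encoder outputs toward a source-language-independent space, which is the property the abstract argues helps zero-shot directions.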
Year: 2019
DOI: 10.18653/v1/w19-5202
Venue: FOURTH CONFERENCE ON MACHINE TRANSLATION (WMT 2019), VOL 1: RESEARCH PAPERS
DocType: Conference
Volume: abs/1906.08584
Citations: 1
PageRank: 0.34
References: 0
Authors: 4
Name            Order  Citations  PageRank
Ngoc-Quan Pham  1      3          1.74
Jan Niehues     2      259        39.48
Thanh-Le Ha     3      31         10.83
Alex Waibel     4      6343       1980.68