Title
Transfer Learning across Low-Resource, Related Languages for Neural Machine Translation.
Abstract
We present a simple method to improve neural translation of a low-resource language pair using parallel data from a related, also low-resource, language pair. The method is based on the transfer method of Zoph et al., but whereas their method ignores any source vocabulary overlap, ours exploits it. First, we split words using Byte Pair Encoding (BPE) to increase vocabulary overlap. Then, we train a model on the first language pair and transfer its parameters, including its source word embeddings, to another model and continue training on the second language pair. Our experiments show that transfer learning helps word-based translation only slightly, but when used on top of a much stronger BPE baseline, it yields larger improvements of up to 4.3 BLEU.
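The abstract outlines a three-step procedure: segment both language pairs with a joint BPE vocabulary so that source subwords overlap, train a parent model on the related language pair, then transfer all of its parameters, including the source embeddings, and continue training on the low-resource pair. Below is a minimal sketch of that procedure, not the authors' code: the TinyNMT toy model, its dimensions, and the dummy batches are hypothetical stand-ins for an attentional encoder-decoder trained on real BPE-segmented parallel text.

import torch
import torch.nn as nn

class TinyNMT(nn.Module):
    """Toy encoder-decoder; the paper's actual model is an attentional seq2seq."""
    def __init__(self, src_vocab, tgt_vocab, dim=64):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, dim)   # source (BPE subword) embeddings
        self.tgt_emb = nn.Embedding(tgt_vocab, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        _, h = self.encoder(self.src_emb(src_ids))
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), h)
        return self.out(dec_out)

def train_steps(model, batches, steps=50, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for step in range(steps):
        src, tgt_in, tgt_out = batches[step % len(batches)]
        logits = model(src, tgt_in)
        loss = loss_fn(logits.reshape(-1, logits.size(-1)), tgt_out.reshape(-1))
        opt.zero_grad(); loss.backward(); opt.step()

# 1) Train a parent model on the related (also low-resource) language pair.
#    Both pairs are segmented with a joint BPE vocabulary, so source subword
#    IDs are shared between the related source languages.
SRC_VOCAB, TGT_VOCAB = 1000, 1000
parent = TinyNMT(SRC_VOCAB, TGT_VOCAB)
parent_batches = [(torch.randint(0, SRC_VOCAB, (8, 12)),   # dummy tensors stand in
                   torch.randint(0, TGT_VOCAB, (8, 10)),   # for real parallel data
                   torch.randint(0, TGT_VOCAB, (8, 10)))]
train_steps(parent, parent_batches)

# 2) Transfer: initialize the child model with all parent parameters,
#    including the source embeddings (this is where vocabulary overlap helps).
child = TinyNMT(SRC_VOCAB, TGT_VOCAB)
child.load_state_dict(parent.state_dict())

# 3) Continue training on the low-resource child language pair.
child_batches = [(torch.randint(0, SRC_VOCAB, (8, 12)),
                  torch.randint(0, TGT_VOCAB, (8, 10)),
                  torch.randint(0, TGT_VOCAB, (8, 10)))]
train_steps(child, child_batches)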
Year
2017
Venue
International Joint Conference on Natural Language Processing
DocType
Conference
Volume
abs/1708.09803
Citations
4
PageRank
0.41
References
5
Authors
2
Name           Order  Citations  PageRank
Toan Nguyen    1      55         15.70
David Chiang   2      2843       144.76