Title
Dynamic Curriculum Learning for Low-Resource Neural Machine Translation
Abstract
Large amounts of data have made neural machine translation (NMT) a big success in recent years, but training these models on small-scale corpora remains a challenge. In this case, how the data is used becomes more important. Here, we investigate the effective use of training data for low-resource NMT. In particular, we propose a dynamic curriculum learning (DCL) method to reorder training samples during training. Unlike previous work, we do not use a static scoring function for reordering. Instead, the order of training samples is determined dynamically in two ways: by loss decline and by model competence. This eases training by highlighting easy samples that the current model has enough competence to learn. We test our DCL method in a Transformer-based system. Experimental results show that DCL outperforms several strong baselines on three low-resource machine translation benchmarks and on differently sized subsets of the WMT'16 En-De data.
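The abstract describes the mechanism only at a high level. Below is a minimal Python sketch of one plausible reading: samples are ranked by their recent loss decline, and only the easiest fraction permitted by a square-root competence schedule (the form commonly used in competence-based curriculum learning) is kept for the next update. The function names `competence` and `reorder_pool`, the constant `c0`, and the example numbers are illustrative assumptions, not taken from the paper.

```python
import math

def competence(step, total_steps, c0=0.1):
    """Square-root competence schedule: the fraction of the training
    pool the model may sample from at this step, growing from c0 at
    step 0 to 1.0 at total_steps. (Assumed schedule, not the paper's.)"""
    return min(1.0, math.sqrt(c0 ** 2 + (1.0 - c0 ** 2) * step / total_steps))

def reorder_pool(samples, prev_losses, curr_losses, step, total_steps):
    """Rank samples by loss decline (a quickly falling per-sample loss
    marks a sample the current model finds easy to learn) and keep only
    the easiest fraction allowed by the current competence."""
    decline = [p - c for p, c in zip(prev_losses, curr_losses)]
    order = sorted(range(len(samples)), key=lambda i: decline[i], reverse=True)
    keep = max(1, int(competence(step, total_steps) * len(samples)))
    return [samples[i] for i in order[:keep]]

if __name__ == "__main__":
    pool = [f"pair-{i}" for i in range(8)]
    prev = [3.0, 2.5, 4.0, 3.2, 2.8, 3.9, 3.1, 2.6]
    curr = [2.1, 2.4, 3.9, 2.0, 2.7, 3.8, 2.2, 2.5]
    # Early in training, competence is ~0.33, so only the two
    # fastest-improving samples are kept for the next update.
    print(reorder_pool(pool, prev, curr, step=100, total_steps=1000))
```

Because both the ranking (loss decline) and the cutoff (competence) change as training proceeds, the ordering is recomputed dynamically rather than fixed by a static difficulty score.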
Year: 2020
Venue: COLING
DocType: Conference
Volume: 2020.coling-main
Citations: 0
PageRank: 0.34
References: 0
Authors: 9
Name         Order  Citations  PageRank
Chen Xu      1      0          0.34
Bojie Hu     2      0          2.37
Yufan Jiang  3      0          0.34
Kai Feng     4      0          0.34
Zeyang Wang  5      0          0.34
Shen Huang   6      64         14.51
Qi Ju        7      0          0.34
Tong Xiao    8      131        23.91
Jingbo Zhu   9      703        64.21