Title
Greedy Search with Probabilistic N-gram Matching for Neural Machine Translation
Abstract
Neural machine translation (NMT) models are usually trained with a word-level loss under the teacher forcing algorithm, which not only evaluates translations improperly but also suffers from exposure bias. Sequence-level training under the reinforcement framework can mitigate the problems of the word-level loss, but its performance is unstable due to the high variance of the gradient estimate. On these grounds, we present a differentiable sequence-level training objective based on probabilistic n-gram matching, which avoids the reinforcement framework. In addition, our method performs greedy search during training, using the predicted words as context just as at inference, to alleviate exposure bias. Experimental results on the NIST Chinese-to-English translation tasks show that our method significantly outperforms reinforcement-based algorithms and improves over a strong baseline system by 1.5 BLEU points on average.
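The abstract describes the objective only at a high level; below is a minimal sketch of one way such a differentiable probabilistic n-gram matching score could be computed, based solely on the abstract. PyTorch, the function name `expected_ngram_matches`, the (T, V) matrix of per-step word distributions kept during greedy decoding, and the BLEU-style clipping by reference counts are all illustrative assumptions, not the authors' implementation.

```python
# Sketch: differentiable expected n-gram matching (assumptions noted above;
# this is not the paper's exact formulation).
import torch
from collections import Counter


def expected_ngram_matches(probs: torch.Tensor, reference: list, n: int = 2) -> torch.Tensor:
    """Expected clipped n-gram matches between a decoded candidate and a reference.

    probs:     (T, V) per-step word distributions saved while greedily
               decoding a candidate of length T (each row sums to 1).
    reference: reference sentence as a list of token ids.
    """
    T = probs.size(0)
    # Count each n-gram in the reference; these counts serve as the
    # clipping bound, as in BLEU's modified precision.
    ref_counts = Counter(tuple(reference[i:i + n])
                         for i in range(len(reference) - n + 1))
    total = probs.new_zeros(())
    for gram, ref_count in ref_counts.items():
        # Expected number of times `gram` occurs in the candidate: at each
        # start position t, the probability of emitting the gram is the
        # product of the per-step probabilities of its tokens.
        expected = probs.new_zeros(())
        for t in range(T - n + 1):
            p = probs.new_ones(())
            for j, tok in enumerate(gram):
                p = p * probs[t + j, tok]
            expected = expected + p
        # Clip by the reference count so repeated grams are not over-credited.
        total = total + torch.clamp(expected, max=float(ref_count))
    return total  # differentiable; maximize it (e.g. minimize its negative)


# Toy usage: 3 decoding steps over a vocabulary of 5 words.
logits = torch.randn(3, 5, requires_grad=True)
probs = torch.softmax(logits, dim=-1)
loss = -expected_ngram_matches(probs, reference=[1, 2, 3], n=2)
loss.backward()  # gradients flow through the probabilities to the logits
```

Because the score is built from the model's probabilities rather than sampled translations, it can be optimized directly by gradient descent, which is what lets the method sidestep the high-variance policy-gradient estimate mentioned in the abstract.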
Year: 2018
Venue: EMNLP
DocType: Journal
Volume: abs/1809.03132
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name         Order  Citations  PageRank
Chenze Shao  1      0          2.70
Xilin Chen   2      6291       306.27
Yang Feng    3      301        38.39