Title
A Simple And Effective Approach To Coverage-Aware Neural Machine Translation
Abstract
We offer a simple and effective method to seek a better balance between model confidence and length preference for Neural Machine Translation (NMT). Unlike the popular length normalization and coverage models, our model does not require training nor reranking the limited n-best outputs. Moreover, it is robust to large beam sizes, which is not well studied in previous work. On the Chinese-English and English-German translation tasks, our approach yields +0.4~1.5 BLEU improvements over the state-of-the-art baselines.
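The abstract contrasts the proposed approach with length normalization and coverage models used to rescore beam-search hypotheses. As background only, the sketch below illustrates the GNMT-style length penalty and coverage penalty (Wu et al., 2016) that such baselines apply; it is not the method proposed in this paper, and the function name, alpha/beta values, and toy inputs are illustrative assumptions.

```python
import numpy as np

def gnmt_rescore(log_prob, attention, alpha=0.6, beta=0.2):
    """Rescore one beam-search hypothesis with GNMT-style penalties.

    log_prob  : float, sum of target-token log-probabilities log P(Y|X)
    attention : (tgt_len, src_len) array of attention weights
    alpha     : length-normalization strength (illustrative value)
    beta      : coverage-penalty strength (illustrative value)
    """
    tgt_len, _ = attention.shape
    # Length penalty: lp(Y) = ((5 + |Y|) / (5 + 1)) ** alpha
    lp = ((5.0 + tgt_len) / 6.0) ** alpha
    # Coverage penalty: cp = beta * sum_j log(min(sum_i a_{ij}, 1.0)),
    # which rewards hypotheses whose attention covers every source word.
    coverage = np.minimum(attention.sum(axis=0), 1.0)
    cp = beta * np.sum(np.log(np.clip(coverage, 1e-9, 1.0)))
    return log_prob / lp + cp

# Toy usage: pick the better of two hypotheses under the combined score.
hyps = [
    (-4.2, np.full((6, 5), 1.0 / 5)),  # (log P(Y|X), attention matrix)
    (-3.9, np.full((4, 5), 1.0 / 5)),
]
best = max(hyps, key=lambda h: gnmt_rescore(*h))
```

Without the penalties, the shorter hypothesis wins on raw log-probability; with length normalization and the coverage bonus, the longer, better-covered hypothesis is preferred, which is the confidence-versus-length trade-off the paper targets.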
Year
2018
Venue
PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 2
Field
Computer science, Machine translation, Artificial intelligence, Natural language processing
DocType
Conference
Volume
P18-2
Citations
0
PageRank
0.34
References
0
Authors
6
Name            Order    Citations    PageRank
Yanyang Li      1        3            1.42
Tong Xiao       2        131          23.91
Yinqiao Li      3        6            2.47
Qiang Wang      4        3            0.72
Changming Xu    5        0            0.34
Jingbo Zhu      6        703          64.21