Abstract |
---|
This paper describes the NiuTrans neural machine translation systems for the WMT 2019 news translation tasks. We participated in 13 translation directions, including 11 supervised tasks, namely EN <-> {ZH, DE, RU, KK, LT} and GU -> EN, as well as the unsupervised DE <-> CS sub-track. Our systems were built on deep Transformer models and several back-translation methods. Iterative knowledge distillation and ensemble+reranking were also employed to obtain stronger models. Our unsupervised submissions were based on NMT enhanced by SMT. As a result, we achieved the highest BLEU scores in the {KK <-> EN, GU -> EN} directions, ranked 2nd in {RU -> EN, DE <-> CS}, and ranked 3rd in {ZH -> EN, LT -> EN, EN -> RU, EN <-> DE} among all constrained submissions. |
Year | DOI | Venue
---|---|---
2019 | 10.18653/v1/w19-5325 | Fourth Conference on Machine Translation (WMT 2019)

DocType | Citations | PageRank
---|---|---
Conference | 1 | 0.36

References | Authors
---|---
0 | 17
Name | Order | Citations | PageRank
---|---|---|---
Bei Li | 1 | 1 | 3.06 |
Yinqiao Li | 2 | 1 | 1.04 |
Chen Xu | 3 | 89 | 17.69 |
Lin Ye | 4 | 8 | 5.64 |
Jiqiang Liu | 5 | 315 | 52.31 |
Hui Liu | 6 | 116 | 29.48 |
Ziyang Wang | 7 | 1 | 0.36 |
Yuhao Zhang | 8 | 1 | 0.36 |
Nuo Xu | 9 | 14 | 7.66 |
Zeyang Wang | 10 | 1 | 0.36 |
Kai Feng | 11 | 1 | 0.36 |
Hexuan Chen | 12 | 1 | 0.36 |
Tengbo Liu | 13 | 1 | 0.36 |
Yanyang Li | 14 | 3 | 1.42 |
Qiang Wang | 15 | 436 | 66.63 |
Tong Xiao | 16 | 131 | 23.91 |
Jingbo Zhu | 17 | 703 | 64.21 |