Abstract
---
This paper describes the NiuTrans neural machine translation systems for the WMT 2021 news translation tasks. We made submissions in 9 language directions, covering English$\leftrightarrow$$\{$Chinese, Japanese, Russian, Icelandic$\}$ and English$\rightarrow$Hausa. Our primary systems are built on several effective variants of the Transformer, e.g., Transformer-DLCL and ODE-Transformer. We also utilize back-translation, knowledge distillation, post-ensemble, and iterative fine-tuning techniques to further enhance model performance.
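Among the techniques listed, post-ensemble is the one that combines the systems at the output level: each single model produces a candidate translation, and the candidate most similar on average to all the others is selected. The sketch below is only a rough illustration of that idea, not the paper's implementation; the unigram-F1 overlap used as the similarity measure and the `post_ensemble` helper are assumptions made for the example.

```python
# Illustrative post-ensemble: pick, from several models' outputs, the
# candidate closest on average to the other candidates.
# The similarity metric (unigram F1) is a stand-in, not the paper's choice.
from collections import Counter


def token_f1(a: str, b: str) -> float:
    """Unigram F1 overlap between two whitespace-tokenized strings."""
    ca, cb = Counter(a.split()), Counter(b.split())
    overlap = sum((ca & cb).values())
    if overlap == 0:
        return 0.0
    p = overlap / sum(cb.values())
    r = overlap / sum(ca.values())
    return 2 * p * r / (p + r)


def post_ensemble(candidates: list[str]) -> str:
    """Return the candidate most similar, on average, to all other candidates."""
    best, best_score = candidates[0], -1.0
    for i, c in enumerate(candidates):
        others = candidates[:i] + candidates[i + 1:]
        score = sum(token_f1(c, o) for o in others) / max(len(others), 1)
        if score > best_score:
            best, best_score = c, score
    return best


# Usage: hypothetical outputs of several single models for one source sentence.
outputs = [
    "the cat sat on the mat",
    "the cat sits on the mat",
    "a cat sat on a mat",
]
print(post_ensemble(outputs))
```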
Year | Venue | DocType
---|---|---
2021 | WMT@EMNLP | Conference
Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors (18)
---
Name | Order | Citations | PageRank |
---|---|---|---
Shuhan Zhou | 1 | 0 | 1.01 |
Tao Zhou | 2 | 2744 | 152.77 |
Binghao Wei | 3 | 0 | 1.01 |
Yingfeng Luo | 4 | 0 | 1.01 |
Yongyu Mu | 5 | 0 | 1.01 |
Zefan Zhou | 6 | 0 | 0.34 |
Chenglong Wang | 7 | 2 | 2.08 |
Xuanjun Zhou | 8 | 0 | 0.68 |
Chuanhao Lv | 9 | 0 | 0.34 |
Yi Jing | 10 | 0 | 1.35 |
Laohu Wang | 11 | 0 | 0.68 |
Jingnan Zhang | 12 | 0 | 1.01 |
Canan Huang | 13 | 0 | 1.01 |
Zhongxiang Yan | 14 | 0 | 0.68 |
Chi Hu | 15 | 3 | 2.10 |
Bei Li | 16 | 1 | 3.06 |
Tong Xiao | 17 | 131 | 23.91 |
Jingbo Zhu | 18 | 703 | 64.21 |