Abstract
The paper presents HW-TSC's submission to the WMT 2020 Automatic Post-Editing (APE) Shared Task. We participate in the English-German and English-Chinese language pairs. Our system is built on a Transformer pre-trained on the WMT 2019 and WMT 2020 News Translation corpora and fine-tuned on the APE corpus. Bottleneck Adapter Layers are integrated into the model to prevent over-fitting. We further collect external translations as augmented MT candidates to improve performance. Experiments demonstrate that pre-trained NMT models are effective when fine-tuned on the limited-size APE corpus, and that performance can be further improved with external MT augmentation. Our system achieves competitive results in both directions in the final evaluation.
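The paper itself includes no code. As a rough illustration of the Bottleneck Adapter Layers the abstract references, the PyTorch sketch below shows the standard bottleneck-adapter pattern: a small down-project/up-project MLP with a residual connection, inserted after a pre-trained Transformer sub-layer so that only the adapter's few parameters are updated during fine-tuning. The hidden sizes, activation, and placement here are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Standard bottleneck adapter: down-project, nonlinearity, up-project,
    plus a residual connection. When the surrounding Transformer weights are
    frozen, only this small module is trained, which limits over-fitting on
    a small fine-tuning corpus. Dimensions are assumptions for illustration."""

    def __init__(self, d_model: int = 512, d_bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, d_bottleneck)  # compress to bottleneck
        self.act = nn.ReLU()
        self.up = nn.Linear(d_bottleneck, d_model)    # restore model dimension

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The residual keeps the pre-trained representation intact; the
        # adapter only learns a small additive correction.
        return x + self.up(self.act(self.down(x)))

# Example: apply the adapter to a batch of token representations.
adapter = BottleneckAdapter()
h = torch.randn(8, 20, 512)   # (batch, seq_len, d_model)
out = adapter(h)              # same shape: (8, 20, 512)
```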
Year | Venue | DocType
---|---|---
2020 | WMT@EMNLP | Conference

Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors (11)

Name | Order | Citations | PageRank
---|---|---|---
Hao Yang | 1 | 0 | 7.44 |
Minghan Wang | 2 | 0 | 2.03 |
Daimeng Wei | 3 | 0 | 5.07 |
Hengchao Shang | 4 | 0 | 4.39 |
Jiaxin Guo | 5 | 0 | 1.69 |
Zongyao Li | 6 | 0 | 2.70 |
Lizhi Lei | 7 | 0 | 4.06 |
Ying Qin | 8 | 0 | 5.75 |
Shimin Tao | 9 | 0 | 4.73 |
Shiliang Sun | 10 | 1 | 2.05 |
Yimeng Chen | 11 | 0 | 4.06 |