Title
HW-TSC's Participation at WMT 2020 Automatic Post Editing Shared Task
Abstract
This paper presents HW-TSC's submission to the WMT 2020 Automatic Post Editing Shared Task. We participate in the English-German and English-Chinese language pairs. Our system is built on a Transformer model pre-trained on the WMT 2019 and WMT 2020 News Translation corpora and fine-tuned on the APE corpus. Bottleneck Adapter Layers are integrated into the model to prevent over-fitting. We further collect external translations as augmented MT candidates to improve performance. The experiments demonstrate that pre-trained NMT models are effective when fine-tuned on an APE corpus of limited size, and that performance can be further improved with external MT augmentation. Our system achieves competitive results in both directions in the final evaluation.
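The abstract mentions Bottleneck Adapter Layers inserted into a pre-trained Transformer to prevent over-fitting during APE fine-tuning. A minimal sketch of such a layer is shown below, in the style of Houlsby et al. (2019): a down-projection, a nonlinearity, an up-projection, and a residual connection. The dimensions, initialization, and class name here are illustrative assumptions; the paper's exact configuration is not specified in this record.

```python
import numpy as np

class BottleneckAdapter:
    """Sketch of a bottleneck adapter layer (dimensions are assumptions,
    not the paper's actual configuration)."""

    def __init__(self, d_model=512, d_bottleneck=64, seed=0):
        rng = np.random.default_rng(seed)
        # Down-projection into a small bottleneck dimension.
        self.W_down = rng.normal(0.0, 0.02, (d_model, d_bottleneck))
        self.b_down = np.zeros(d_bottleneck)
        # Up-projection initialized to zero so the adapter starts
        # as an identity map and cannot disturb the pre-trained model.
        self.W_up = np.zeros((d_bottleneck, d_model))
        self.b_up = np.zeros(d_model)

    def __call__(self, h):
        # h: (seq_len, d_model) hidden states from a frozen Transformer layer.
        z = np.maximum(0.0, h @ self.W_down + self.b_down)  # ReLU bottleneck
        return h + z @ self.W_up + self.b_up                # residual add

# Usage: at initialization the adapter passes hidden states through unchanged.
adapter = BottleneckAdapter()
h = np.ones((3, 512))
out = adapter(h)
```

During fine-tuning, only the adapter parameters (and typically layer norms) are updated while the pre-trained Transformer weights stay frozen, which is what limits over-fitting on a small APE corpus.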
Year
2020
Venue
WMT@EMNLP
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
11

Name            Order  Citations  PageRank
Hao Yang        1      0          7.44
Minghan Wang    2      0          2.03
Daimeng Wei     3      0          5.07
Hengchao Shang  4      0          4.39
Jiaxin Guo      5      0          1.69
Zongyao Li      6      0          2.70
Lizhi Lei       7      0          4.06
Ying Qin        8      0          5.75
Shimin Tao      9      0          4.73
Shiliang Sun    10     1          2.05
Yimeng Chen     11     0          4.06