Title
HW-TSC's Participation in the IWSLT 2022 Isometric Spoken Language Translation
Abstract
This paper presents our submissions to the IWSLT 2022 Isometric Spoken Language Translation task. We participate in all three language pairs (English-German, English-French, English-Spanish) under the constrained setting and submit an English-German result under the unconstrained setting. We use the standard Transformer model as the baseline and obtain the best performance with a variant that shares the decoder input and output embeddings. We perform detailed pre-processing and filtering on the provided bilingual data. Several strategies are used to train our models, such as Multilingual Translation, Back Translation, Forward Translation, R-Drop, Average Checkpoint, and Ensemble. We investigate three methods for biasing the output length: i) conditioning the output on a given target-source length-ratio class; ii) enriching the Transformer positional embedding with length information; and iii) length-control decoding for non-autoregressive translation. Our submissions achieve 30.7, 41.6, and 36.7 BLEU on the tst-COMMON test sets for the English-German, English-French, and English-Spanish tasks, respectively, and fully comply with the length requirements.
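As a concrete illustration of method i), the sketch below shows one common way to condition on a length-ratio class: prefix each source sentence with a pseudo-token for its observed target/source length ratio during training, then force the desired class at inference. This follows the general recipe of Lakew et al. (2019) rather than the paper's exact implementation; the token names and thresholds are illustrative assumptions (the ±10% band mirrors the task's character-length compliance window).

```python
# A minimal sketch (not the authors' code) of length-ratio class
# conditioning. Token names and thresholds are illustrative.

def length_ratio_class(src: str, tgt: str,
                       low: float = 0.90, high: float = 1.10) -> str:
    """Classify a training pair by its target/source character-length ratio."""
    ratio = len(tgt) / max(len(src), 1)
    if ratio < low:
        return "<short>"
    if ratio > high:
        return "<long>"
    return "<normal>"

def tag_source(src: str, tgt: str) -> str:
    """Prepend the length-class pseudo-token to the source sentence."""
    return f"{length_ratio_class(src, tgt)} {src}"

# Training pairs are tagged with their observed class; at inference we
# always prepend "<normal>" to request isometric (similar-length) output.
print(tag_source("The cat sat on the mat.", "Die Katze saß auf der Matte."))
# -> "<long> The cat sat on the mat."  (the German target is >10% longer)
```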
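Method ii) can be illustrated with a length-difference positional encoding in the spirit of Takase and Okazaki (2019): instead of encoding the absolute decoder position, the sinusoids encode how many tokens remain before the desired output length, so the decoder can plan its ending. The sketch below is an assumed illustration, not the paper's implementation; the function and parameter names are ours.

```python
import math
import torch

def length_difference_encoding(target_len: int, d_model: int) -> torch.Tensor:
    """Sinusoidal encoding keyed to the remaining length budget
    (target_len - pos) rather than the position itself."""
    pe = torch.zeros(target_len, d_model)
    for pos in range(target_len):
        remaining = target_len - pos  # tokens left before the length budget runs out
        for i in range(0, d_model, 2):
            div = 10000.0 ** (i / d_model)
            pe[pos, i] = math.sin(remaining / div)
            if i + 1 < d_model:
                pe[pos, i + 1] = math.cos(remaining / div)
    return pe

# Usage: add this to the decoder token embeddings in place of (or alongside)
# the standard positional encoding, with target_len set from the source
# length to request isometric output.
pe = length_difference_encoding(target_len=20, d_model=8)
print(pe.shape)  # torch.Size([20, 8])
```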
Year
2022
DOI
10.18653/v1/2022.iwslt-1.33
Venue
International Conference on Spoken Language Translation (IWSLT)
DocType
Conference
Volume
Proceedings of the 19th International Conference on Spoken Language Translation (IWSLT 2022)
Citations
0
PageRank
0.34
References
0
Authors
12
Name            Order  Citations  PageRank
Zongyao Li      1      0          0.68
Jiaxin Guo      2      0          4.73
Daimeng Wei     3      0          5.07
Hengchao Shang  4      0          4.39
Minghan Wang    5      0          5.07
Ting Zhu        6      29         4.41
Zhanglin Wu     7      0          2.37
Zhengzhe Yu     8      0          2.70
Xiaoyu Chen     9      36         9.63
Lizhi Lei       10     0          4.06
Hao Yang        11     0          7.44
Ying Qin        12     0          5.75