Title
---
Rakuten's Participation in WAT 2021: Examining the Effectiveness of Pre-Trained Models for Multilingual and Multimodal Machine Translation
Abstract
---
This paper describes the participation of our neural machine translation systems in the WAT 2021 shared translation tasks (team ID: sakura). We participated in the (i) NICT-SAP, (ii) Japanese-English multimodal translation, (iii) Multilingual Indic, and (iv) Myanmar-English translation tasks. Multilingual approaches such as mBART (Liu et al., 2020) pre-train a complete multilingual sequence-to-sequence model with denoising objectives, making them a strong starting point for building multilingual translation systems. Our main focus in this work is to investigate the effectiveness of multilingual fine-tuning of such a multilingual language model on various translation tasks, including low-resource, multimodal, and mixed-domain translation. We further explore a multimodal approach based on universal visual representation (Zhang et al., 2019) and compare its performance against a unimodal approach based on mBART alone.
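For readers unfamiliar with the fine-tuning step the abstract refers to, below is a minimal sketch using the Hugging Face `transformers` library and the public `facebook/mbart-large-cc25` checkpoint. The language codes, toy sentence pair, and learning rate are illustrative assumptions, not the authors' actual setup or data.

```python
# A minimal sketch, assuming Hugging Face transformers >= 4.21:
# fine-tune a pre-trained mBART checkpoint on one translation direction.
# Checkpoint name, language codes, and hyperparameters are illustrative.
import torch
from transformers import MBartForConditionalGeneration, MBartTokenizer

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")
tokenizer = MBartTokenizer.from_pretrained(
    "facebook/mbart-large-cc25", src_lang="ja_XX", tgt_lang="en_XX"
)

# Toy parallel pair; multilingual fine-tuning would mix batches drawn
# from several language directions into the same training loop.
src_texts = ["猫がマットの上で寝ている。"]
tgt_texts = ["The cat is sleeping on the mat."]

# Tokenize source and target; `text_target` fills in the `labels` field.
batch = tokenizer(src_texts, text_target=tgt_texts,
                  return_tensors="pt", padding=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
model.train()
loss = model(**batch).loss  # cross-entropy over the gold target tokens
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```

In this paradigm the same loop serves every task the paper covers: only the parallel data (and, for the multimodal track, any extra visual features) changes, while the pre-trained sequence-to-sequence weights provide the initialization.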
Year | Venue | DocType
---|---|---
2021 | WAT 2021: The 8th Workshop on Asian Translation | Conference
Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors
---
5
Name | Order | Citations | PageRank |
---|---|---|---
Raymond Hendy Susanto | 1 | 0 | 0.34 |
Dongzhe Wang | 2 | 0 | 0.34 |
Sunil Yadav | 3 | 0 | 0.34 |
Mausam Jain | 4 | 0 | 0.34 |
Ohnmar Htun | 5 | 0 | 0.34 |