Title |
---|
Stronger Baselines for Grammatical Error Correction Using Pretrained Encoder-Decoder Model |
Abstract |
---|
The grammatical error correction (GEC) literature has reported on the effectiveness of pretraining a Seq2Seq model with a large amount of pseudo data. In this study, we explored two generic pretrained encoder-decoder (Enc-Dec) models, including BART, which has reported state-of-the-art (SOTA) results for several Seq2Seq tasks other than GEC. We found that monolingual and multilingual BART models achieve high performance in GEC, including a result competitive with the current SOTA in English GEC. Our implementations will be publicly available on GitHub. |
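As a rough illustration of the approach summarized in the abstract, treating GEC as a plain Seq2Seq task on top of a generic pretrained encoder-decoder model, the sketch below fine-tunes an off-the-shelf BART checkpoint with the Hugging Face transformers library. The checkpoint name, hyperparameters, and toy sentence pair are illustrative assumptions, not the authors' released implementation.

```python
# A minimal sketch (not the authors' code) of fine-tuning a pretrained BART
# model for GEC as a standard Seq2Seq task, using Hugging Face transformers.
# The checkpoint name and the toy sentence pair below are assumptions.
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "facebook/bart-base"  # any pretrained BART checkpoint
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# One toy GEC pair: ungrammatical source -> corrected target.
source = "She go to school every days ."
target = "She goes to school every day ."

inputs = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids

# A single supervised fine-tuning step: the decoder is trained to
# reproduce the corrected sentence given the erroneous one.
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()

# After fine-tuning on a full GEC corpus, corrections are produced with
# ordinary beam-search decoding.
model.eval()
with torch.no_grad():
    generated = model.generate(**inputs, num_beams=5, max_length=64)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

In practice the same training step would be run over a full corpus of erroneous/corrected sentence pairs, and a multilingual checkpoint such as mBART could be substituted for non-English GEC.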
Year | Venue | DocType |
---|---|---|
2020 | AACL/IJCNLP | Conference |
Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Satoru Katsumata | 1 | 0 | 3.38 |
Mamoru Komachi | 2 | 241 | 44.56 |