Title
TMU Transformer System Using BERT for Re-ranking at BEA 2019 Grammatical Error Correction on Restricted Track
Abstract
We introduce our system submitted to the restricted track of the BEA 2019 shared task on grammatical error correction (GEC). It is essential to select an appropriate hypothesis sentence from the candidate list generated by the GEC model. A re-ranker can evaluate the naturalness of a corrected sentence using language models trained on large corpora. On the other hand, these language models and language representations do not explicitly take into account the grammatical errors written by learners. Thus, it is not straightforward to utilize language representations trained on a large corpus, such as Bidirectional Encoder Representations from Transformers (BERT), in a form suited to learners' grammatical errors. Therefore, we propose fine-tuning BERT on learner corpora with grammatical errors for re-ranking. Experimental results on the W&I+LOCNESS development dataset demonstrate that re-ranking using BERT can effectively improve correction performance.
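To make the re-ranking idea concrete, the sketch below scores each GEC hypothesis with a BERT masked language model's pseudo-log-likelihood and keeps the highest-scoring one. This is a minimal illustration, not the paper's implementation: it assumes the Hugging Face transformers library, uses the generic bert-base-uncased checkpoint as a stand-in for the learner-corpus fine-tuned BERT the paper proposes, and the pseudo-log-likelihood scoring function and example sentences are assumptions for demonstration.

```python
import torch
from transformers import BertForMaskedLM, BertTokenizer

# Stand-in checkpoint; the paper fine-tunes BERT on learner corpora instead.
MODEL_NAME = "bert-base-uncased"

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertForMaskedLM.from_pretrained(MODEL_NAME)
model.eval()


def pseudo_log_likelihood(sentence: str) -> float:
    """Mask each token in turn and sum the log-probability BERT assigns to
    the original token (an assumed scoring function, not necessarily the
    paper's exact one)."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    total = 0.0
    # Skip the [CLS] (first) and [SEP] (last) special tokens.
    for pos in range(1, len(ids) - 1):
        masked = ids.clone()
        masked[pos] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, pos]
        log_probs = torch.log_softmax(logits, dim=-1)
        total += log_probs[ids[pos]].item()
    return total


def rerank(candidates):
    """Return the candidate the BERT scorer rates as most natural."""
    return max(candidates, key=pseudo_log_likelihood)


if __name__ == "__main__":
    # Hypothetical candidate corrections from a GEC model.
    hypotheses = [
        "He have been to Tokyo last year .",
        "He went to Tokyo last year .",
        "He has gone to Tokyo last year .",
    ]
    print(rerank(hypotheses))
```

In practice, the re-ranker's score would typically be combined with the GEC model's own hypothesis score before selecting the final correction; the combination scheme here is left out for brevity.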
Year
2019
DOI
10.18653/v1/w19-4422
Venue
Innovative Use of NLP for Building Educational Applications
Field
Ranking, Computer science, Transformer, Algorithm, Error detection and correction
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
4
Name               Order  Citations  PageRank
Masahiro Kaneko    1      2          7.85
Kengo Hotate       2      0          1.35
Satoru Katsumata   3      0          3.38
Mamoru Komachi     4      241        44.56