Title: Chinese Grammatical Correction Using BERT-based Pre-trained Model
Abstract: In recent years, pre-trained models have been extensively studied, and several downstream tasks have benefited from their use. In this study, we verify the effectiveness of two methods that incorporate a BERT-based pre-trained model developed by Cui et al. (2020) into an encoder-decoder model on Chinese grammatical error correction tasks. We also analyze the error types and conclude that sentence-level errors remain to be addressed.
Year: 2020
Venue: AACL/IJCNLP
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 4

Name                Order   Citations   PageRank
Hongfei Wang        1       7           2.34
Michiki Kurosawa    2       0           0.68
Satoru Katsumata    3       0           3.38
Mamoru Komachi      4       241         44.56