Title
Applying A Hybrid Sequential Model To Chinese Sentence Correction
Abstract
In recent years, Chinese has become one of the most popular languages globally, and the demand for automatic Chinese sentence correction has increased accordingly. Such a system can be applied to Chinese language learning to reduce learning costs and feedback time, and can help writers check for incorrect words. The traditional approach to Chinese sentence correction checks whether each word exists in a predefined dictionary; however, this method cannot handle semantic errors. With the rise of deep learning, an artificial neural network can be applied to understand a sentence's context and correct semantic errors. Nevertheless, several issues remain open: the accuracy and the computation time required to correct a sentence are still unsatisfactory, so deep-learning-based Chinese sentence correction systems may not yet be ready for large-scale commercial applications. Our goal is to obtain a model with better accuracy and computation time. By combining a recurrent neural network with Bidirectional Encoder Representations from Transformers (BERT), a recently popular model known for its high performance but slow inference speed, we introduce a hybrid model for Chinese sentence correction that improves both accuracy and inference speed. Among the results, BERT-GRU obtained the highest BLEU score in all experiments. Compared with the original Transformer-based model, inference speed improved by 1131% with beam-search decoding and by 452% with greedy decoding in the 128-word experiment; the longer the sequence, the larger the improvement.
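The abstract contrasts beam-search and greedy decoding speed. As a minimal sketch of what these two decoding strategies do, the toy below decodes from a hand-written bigram table standing in for the decoder's next-token distribution; the table, vocabulary, and all names are hypothetical illustrations, not the paper's BERT-GRU model.

```python
import math

# Hypothetical next-token distribution: a fixed bigram table standing in
# for a learned decoder (the paper's BERT-GRU model is not reproduced here).
BIGRAM = {
    "<s>": {"a": 0.6, "b": 0.4},
    "a":   {"b": 0.5, "</s>": 0.5},
    "b":   {"a": 0.3, "</s>": 0.7},
}

def greedy_decode(max_len=10):
    """Pick the single most probable next token at each step."""
    seq = ["<s>"]
    while seq[-1] != "</s>" and len(seq) < max_len:
        probs = BIGRAM[seq[-1]]
        seq.append(max(probs, key=probs.get))
    return seq[1:]

def beam_decode(beam_width=2, max_len=10):
    """Keep the beam_width partial sequences with highest log-probability."""
    beams = [(["<s>"], 0.0)]  # (tokens, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for toks, score in beams:
            if toks[-1] == "</s>":
                candidates.append((toks, score))  # finished beam carries over
                continue
            for tok, p in BIGRAM[toks[-1]].items():
                candidates.append((toks + [tok], score + math.log(p)))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
        if all(t[-1] == "</s>" for t, _ in beams):
            break
    return beams[0][0][1:]  # drop the "<s>" start token
```

On this toy table, greedy decoding yields `["a", "b", "</s>"]`, while beam search finds the higher-probability `["a", "</s>"]`: beam search explores more candidates per step, which is exactly why it is slower than greedy decoding and why the reported speedup differs between the two.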
Year: 2020
DOI: 10.3390/sym12121939
Venue: SYMMETRY-BASEL
Keywords: BERT, transformer, RNN, deep learning
DocType: Journal
Volume: 12
Issue: 12
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name | Order | Citations | PageRank
Jun Wei Chen | 1 | 0 | 0.34
Xanno Kharis Sigalingging | 2 | 0 | 0.34
Jenq-Shiou Leu | 3 | 238 | 40.64
Jun-ichi Takada | 4 | 237 | 47.62