Title
Approaching Neural Grammatical Error Correction as a Low-Resource Machine Translation Task
Abstract
Previously, neural methods in grammatical error correction (GEC) did not reach state-of-the-art results compared to phrase-based statistical machine translation (SMT) baselines. We demonstrate parallels between neural GEC and low-resource neural MT and successfully adapt several methods from low-resource MT to neural GEC. We further establish guidelines for trustable results in neural GEC and propose a set of model-independent methods for neural GEC that can be easily applied in most GEC settings. Proposed methods include adding source-side noise, domain-adaptation techniques, a GEC-specific training-objective, transfer learning with monolingual data, and ensembling of independently trained GEC models and language models. The combined effects of these methods result in better than state-of-the-art neural GEC models that outperform previously best neural GEC systems by more than 10% M$^2$ on the CoNLL-2014 benchmark and 5.9% on the JFLEG test set. Non-neural state-of-the-art systems are outperformed by more than 2% on the CoNLL-2014 benchmark and by 4% on JFLEG.
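Of the methods listed in the abstract, the source-side noise component lends itself to a short illustration. The sketch below shows one generic way to inject token-level noise (random deletions and adjacent swaps) into the errorful source side of GEC training pairs, in the spirit of low-resource MT regularization; the function name add_source_noise and the probabilities p_drop and p_swap are illustrative assumptions, not the paper's exact procedure.
```python
import random


def add_source_noise(tokens, p_drop=0.1, p_swap=0.1, seed=None):
    """Illustrative token-level noising of the source (errorful) side.

    A generic sketch of source-side noise; the exact noising scheme used in
    the paper may differ.
    """
    rng = random.Random(seed)
    # Randomly drop tokens (never drop a single-token sentence entirely).
    noised = [t for t in tokens if rng.random() > p_drop or len(tokens) == 1]
    # Randomly swap adjacent tokens.
    i = 0
    while i < len(noised) - 1:
        if rng.random() < p_swap:
            noised[i], noised[i + 1] = noised[i + 1], noised[i]
            i += 2  # skip past the swapped pair
        else:
            i += 1
    return noised


# Example: noising the source sentence of one GEC training pair.
source = "He go to school every days .".split()
print(add_source_noise(source, seed=0))
```
Noising only the source side leaves the corrected targets untouched, so the model still learns to produce clean output while becoming less sensitive to surface variation in its input.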
Year
2018
DOI
10.18653/v1/N18-1055
Venue
North American Chapter of the Association for Computational Linguistics
DocType
Journal
Volume
abs/1804.05940
Citations
6
PageRank
0.48
References
23
Authors
4
Name | Order | Citations | PageRank
Marcin Junczys-Dowmunt | 1 | 312 | 24.24
Roman Grundkiewicz | 2 | 109 | 11.75
Shubha Guha | 3 | 6 | 0.48
Kenneth Heafield | 4 | 7 | 1.86