Abstract |
---|
Recently, end-to-end neural network-based approaches have shown significant improvements over traditional pipeline-based models in English coreference resolution. However, such advancements came at the cost of computational complexity, and recent work has not focused on tackling this problem. Hence, in this paper, we propose a BERT-SRU-based Pointer Network that leverages the linguistic properties of head-final languages. Applying this model to Korean coreference resolution, we significantly reduce the coreference-linking search space. Combining this with Ensemble Knowledge Distillation, we maintain state-of-the-art performance of 66.9% CoNLL F1 on the ETRI test set while achieving a 2x speedup (30 docs/sec) in document processing time. |
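The abstract's Ensemble Knowledge Distillation step can be illustrated with a minimal sketch: several teacher models' temperature-softened output distributions are averaged into soft targets, and a student is trained against them with a cross-entropy loss. This is a generic, hypothetical illustration of the technique, not the paper's actual implementation; all function names and the temperature value are assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of raw scores.
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def ensemble_soft_targets(teacher_logits, temperature=2.0):
    # Average the softened distributions of several teacher models
    # into a single soft-target distribution for the student.
    dists = [softmax(logits, temperature) for logits in teacher_logits]
    k = len(dists)
    return [sum(d[i] for d in dists) / k for i in range(len(dists[0]))]

def distillation_loss(student_logits, soft_targets, temperature=2.0):
    # Cross-entropy between the ensemble's soft targets and the
    # student's softened output (the distillation objective).
    student = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(soft_targets, student))

# Toy usage: two teachers scoring two candidate antecedents.
targets = ensemble_soft_targets([[2.0, 0.5], [1.5, 1.0]], temperature=2.0)
loss = distillation_loss([1.8, 0.7], targets, temperature=2.0)
```

In practice the student would be the smaller BERT-SRU pointer model and the teachers a full-size ensemble; the hyperparameters here are placeholders.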
Field | Value |
---|---|
Year | 2020 |
DOI | 10.18653/V1/2020.FINDINGS-EMNLP.237 |
Venue | EMNLP |
DocType | Conference |
Volume | 2020.findings-emnlp |
Citations | 0 |
PageRank | 0.34 |
References | 0 |
Authors | 5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Cheon-Eum Park | 1 | 1 | 3.05 |
Jamin Shin | 2 | 9 | 2.84 |
Sungjoon Park | 3 | 19 | 4.43 |
Joonho Lim | 4 | 0 | 0.34 |
Changki Lee | 5 | 279 | 26.18 |