Title
Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution.
Abstract
The use of pretrained masked language models (MLMs) has drastically improved the performance of zero anaphora resolution (ZAR). We further expand this approach with a novel pretraining task and finetuning method for Japanese ZAR. Our pretraining task aims to acquire the anaphoric relational knowledge necessary for ZAR from a large-scale raw corpus. The ZAR model is finetuned in the same manner as in pretraining. Our experiments show that combining the proposed methods surpasses previous state-of-the-art performance by a large margin, providing insight into the remaining challenges.
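The title suggests a pretraining task in which an overt mention is masked and the model must recover its referent from the surrounding context, in the manner of masked language modeling. The sketch below is a minimal, hypothetical illustration of how such pseudo training instances could be built from raw text; the function names, masking heuristic, and toy English example are assumptions for illustration, not the authors' actual pipeline (which targets Japanese case arguments).

```python
# Hypothetical sketch: build "pseudo zero pronoun" pretraining examples from
# raw sentences by masking an argument whose referent appears earlier in the
# context, so an MLM-style model can be trained to fill the gap.
# Illustrative only; not the authors' implementation.

from dataclasses import dataclass
from typing import Callable, List

MASK = "[MASK]"


@dataclass
class PseudoZarExample:
    masked_tokens: List[str]  # context with one argument replaced by [MASK]
    antecedent: str           # token the model must recover (MLM-style label)
    mask_index: int           # position of the masked argument


def make_pseudo_example(tokens: List[str], argument_index: int) -> PseudoZarExample:
    """Replace one argument token with [MASK], keeping the original as the label."""
    antecedent = tokens[argument_index]
    masked = tokens.copy()
    masked[argument_index] = MASK
    return PseudoZarExample(masked, antecedent, argument_index)


def build_examples(sentences: List[List[str]],
                   is_argument: Callable[[str], bool]) -> List[PseudoZarExample]:
    """Scan raw sentences and mask each candidate argument once.

    A token is masked only if an identical token occurs earlier in the
    sentence, so the "antecedent" is recoverable from the remaining context.
    """
    examples = []
    for tokens in sentences:
        for i, tok in enumerate(tokens):
            if is_argument(tok) and tok in tokens[:i]:
                examples.append(make_pseudo_example(tokens, i))
    return examples


if __name__ == "__main__":
    # Toy English stand-in for a sentence with a repeated argument.
    sents = [["Ken", "bought", "a", "book", "and", "Ken", "read", "it", "."]]
    for ex in build_examples(sents, is_argument=lambda t: t[0].isupper()):
        print(" ".join(ex.masked_tokens), "->", ex.antecedent)
```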
Year: 2021
Venue: EMNLP
DocType: Conference
Volume: 2021.emnlp-main
Citations: 0
PageRank: 0.34
References: 0
Authors: 5
Name                      Order  Citations  PageRank
Ryuto Konno               1      0          0.34
Shun Kiyono               2      0          3.72
Yuichiroh Matsubayashi    3      37         7.26
Hiroki Ouchi              4      18         8.08
Kentaro Inui              5      1008       120.35