Title
Minimize Exposure Bias of Seq2Seq Models in Joint Entity and Relation Extraction
Abstract
Joint entity and relation extraction aims to extract relation triplets from plain text directly. Prior work leverages Sequence-to-Sequence (Seq2Seq) models for triplet sequence generation. However, Seq2Seq enforces an unnecessary order on the unordered triplets and involves a large decoding length, which leads to error accumulation. These issues introduce exposure bias, which may cause the models to overfit to frequent label combinations and thus deteriorate generalization. We propose a novel Sequence-to-Unordered-Multi-Tree (Seq2UMTree) model that minimizes the effects of exposure bias by limiting the decoding length within a triplet to three and removing the order among triplets. We evaluate our model on two datasets, DuIE and NYT, and systematically study how exposure bias alters the performance of Seq2Seq models. Experiments show that the state-of-the-art Seq2Seq model overfits to both datasets, while Seq2UMTree shows significantly better generalization. Our code is available at https://github.com/WindChimeRan/OpenJERE .
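The order problem the abstract describes can be illustrated with a toy example (a hypothetical sketch, not the authors' code): a sentence yields a *set* of triplets, but a Seq2Seq decoder must commit to one arbitrary linearization of that set as its gold target, whereas set-wise (order-free) matching, as in Seq2UMTree-style evaluation, treats all orderings as equivalent.

```python
from itertools import permutations

# Toy triplet set extracted from one sentence; as a set, it has no inherent order.
triplets = {
    ("Obama", "born_in", "Hawaii"),
    ("Obama", "president_of", "USA"),
}

# A Seq2Seq decoder must pick ONE linearization as the gold sequence, even
# though every permutation is equally valid; the number of arbitrary
# orderings grows factorially with the number of triplets.
linearizations = [list(p) for p in permutations(sorted(triplets))]
print(len(linearizations))  # 2 triplets -> 2 possible gold orders

# Order-free matching instead compares predictions as sets, so no
# artificial ordering (and the exposure bias it induces) is ever imposed.
def set_match(pred, gold):
    return set(pred) == set(gold)

# The two linearizations disagree as sequences but agree as sets.
print(set_match(linearizations[0], linearizations[1]))  # True
```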
Year: 2020
DOI: 10.18653/V1/2020.FINDINGS-EMNLP.23
Venue: EMNLP
DocType: Conference
Volume: 2020.findings-emnlp
Citations: 0
PageRank: 0.34
References: 29
Authors: 9
Name                 Order  Citations  PageRank
Haoran Zhang         1      1          0.69
Qianying Liu         2      3          3.06
Aysa Xuemo Fan       3      0          0.34
Heng Ji              4      15441      27.27
Daojian Zeng         5      370        13.02
Fei Cheng            6      0          1.69
Daisuke Kawahara     7      705        61.89
Sadao Kurohashi      8      10831      77.05
Ranran Haoran Zhang  9      0          0.34