Abstract |
---|
Relation extraction is an important NLP task that extracts the semantic relationship between two entities. Recently, large-scale pre-trained language models have achieved excellent performance on many NLP applications. Most existing relation extraction models rely mainly on context information, but entity information is also very important for relation extraction, especially domain knowledge of entities and the direction between entity pairs. In this paper, building on the pre-trained BERT model, we propose a multi-task joint relation extraction model that incorporates knowledge representation learning (KRL). Experimental results on the SemEval 2010 Task 8 and KBP37 datasets show that our proposed model outperforms most state-of-the-art methods. Results on the larger FewRe180 dataset, refined from FewRel, also indicate that adding knowledge representation learning as an auxiliary objective is helpful for the relation extraction task. |
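The abstract describes combining a relation-classification objective with a KRL auxiliary objective. The paper's exact formulation is not given in this record; the sketch below only illustrates the general multi-task idea, pairing a cross-entropy relation extraction loss with a TransE-style margin loss. The function name `joint_loss`, the weight `krl_weight`, and all tensor shapes are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def joint_loss(relation_logits, relation_labels,
               head_emb, rel_emb, tail_emb, corrupt_tail_emb,
               krl_weight=0.5, margin=1.0):
    """Hypothetical joint objective: cross-entropy for relation
    classification plus a TransE-style margin ranking loss, where
    ||h + r - t|| should be smaller for true triples than for
    corrupted ones. `krl_weight` balances the auxiliary KRL task."""
    # Main task: supervised relation classification.
    re_loss = F.cross_entropy(relation_logits, relation_labels)
    # Auxiliary task: TransE-style knowledge representation learning.
    pos = torch.norm(head_emb + rel_emb - tail_emb, p=2, dim=-1)
    neg = torch.norm(head_emb + rel_emb - corrupt_tail_emb, p=2, dim=-1)
    krl_loss = F.relu(margin + pos - neg).mean()
    return re_loss + krl_weight * krl_loss
```

In this kind of setup, both losses are typically backpropagated through shared BERT encodings, so the auxiliary KRL signal regularizes the entity representations used by the relation classifier.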
Year | DOI | Venue |
---|---|---|
2021 | 10.1109/ICTAI52525.2021.00191 | 2021 IEEE 33RD INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2021) |
Keywords | DocType | ISSN
---|---|---
Knowledge representation learning, Pretraining, Relation extraction, Multi-task learning | Conference | 1082-3409
Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors |
---|
6 |
Name | Order | Citations | PageRank |
---|---|---|---
Wenxing Hong | 1 | 42 | 7.61 |
Shuyan Li | 2 | 0 | 0.34 |
Zhiqiang Hu | 3 | 0 | 0.34 |
Abdur Rasool | 4 | 0 | 1.35 |
Qingshan Jiang | 5 | 0 | 0.34 |
qingan | 6 | 122 | 12.38 |