Title |
---|
Relation Classification via Keyword-Attentive Sentence Mechanism and Synthetic Stimulation Loss |
Abstract |
---|
Previous studies have shown that attention mechanisms and shortest dependency paths have a positive effect on relation classification. In this paper, a keyword-attentive sentence mechanism is proposed to effectively combine the two methods. Furthermore, to handle the imbalanced classification problem, this paper proposes a new loss function, called the synthetic stimulation loss, which uses a modulating factor so that the model focuses on hard-to-classify samples. The two proposed methods are integrated into a bidirectional gated recurrent unit (BiGRU). Because a single model is weak in noise immunity, this paper applies the mutual learning method, forcing the networks to teach each other; hence, the final model is called SSL-KAS-MuBiGRU. Experiments on the SemEval-2010 Task 8 and TAC40 datasets demonstrate that the keyword-attentive sentence mechanism and the synthetic stimulation loss are useful for relation classification, and that our model achieves state-of-the-art results.
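The abstract describes the synthetic stimulation loss only as a cross-entropy-style objective with a modulating factor that down-weights easy samples. The paper's exact formula is not given here, so the sketch below illustrates the general idea with a focal-loss-style factor `(1 - p_t)**gamma`; the function name and the `gamma` hyperparameter are assumptions, not the authors' definitions.

```python
import math

def modulated_cross_entropy(probs, labels, gamma=2.0):
    """Cross-entropy scaled by a modulating factor (1 - p_t)**gamma.

    Easy samples (true-class probability p_t near 1) get a factor near 0,
    so the averaged loss is dominated by hard-to-classify samples.
    `gamma` is a hypothetical hyperparameter controlling the down-weighting.
    """
    total = 0.0
    for dist, y in zip(probs, labels):
        p_t = dist[y]                              # predicted prob of the true class
        total += -((1.0 - p_t) ** gamma) * math.log(p_t)
    return total / len(labels)

# An easy sample (p_t = 0.9) contributes far less than a hard one (p_t = 0.3).
easy = modulated_cross_entropy([[0.9, 0.1]], [0])
hard = modulated_cross_entropy([[0.3, 0.7]], [0])
```

With `gamma = 0` this reduces to plain cross-entropy, so the modulating factor can be tuned independently of the base objective.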
Year | DOI | Venue |
---|---|---|
2019 | 10.1109/TASLP.2019.2921726 | IEEE/ACM Trans. Audio, Speech & Language Processing |
Keywords | Field | DocType |
Feature extraction,Semantics,Task analysis,Syntactics,Adaptation models,Neural networks,Kernel | Kernel (linear algebra),Pattern recognition,Task analysis,Computer science,Speech recognition,Feature extraction,Artificial intelligence,Relation classification,Mutual learning,Artificial neural network,Sentence,Semantics | Journal |
Volume | Issue | ISSN |
27 | 9 | 2329-9290 |
Citations | PageRank | References |
1 | 0.35 | 13 |
Authors |
---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Luoqin Li | 1 | 1 | 0.35 |
Jiabing Wang | 2 | 5 | 2.43 |
Jichang Li | 3 | 6 | 1.42 |
Qianli Ma | 4 | 20 | 5.80 |
Jia Wei | 5 | 4 | 3.09