Title
Hybrid neural tagging model for open relation extraction
Abstract
The Open Relation Extraction (ORE) task remains challenging: it aims to obtain a semantic representation by discovering arbitrary relations in unstructured text. Conventional methods depend heavily on feature engineering or syntactic parsing, which are inefficient or prone to error cascading. Recently, supervised deep learning has become a promising way to address the ORE task. However, two main challenges remain: (1) the lack of a sufficiently large labeled corpus to support supervised training; and (2) the design of a neural architecture suited to the characteristics of open relation extraction. In this paper, we build a large-scale, high-quality training corpus in a fully automated way, and we design a tagging scheme that transforms the ORE task into a sequence tagging process. Furthermore, we propose a hybrid neural network model (HNN4ORT) for open relation tagging. The model employs the Ordered Neurons LSTM to encode potential syntactic information and capture the associations among arguments and relations. It also introduces a novel Dual Aware Mechanism, comprising Local-aware Attention and Global-aware Convolution; the two awarenesses complement each other. The model takes sentence-level semantics as a global perspective while exploiting salient local features to achieve sparse annotation. Experimental results on various test sets show that our model achieves state-of-the-art performance compared with conventional methods and other neural models.
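The abstract describes casting ORE as sequence tagging. As an illustrative sketch only (the paper's actual tagging scheme is not specified here; the B-REL/I-REL/O tag names and the `relation_tags` helper are assumptions), a relation phrase inside a sentence can be encoded as one tag per token:

```python
# Illustrative sketch, not the authors' exact scheme: label relation-phrase
# tokens with B/I tags and all other tokens with O, so extracting the
# relation reduces to predicting one tag per token.
def relation_tags(tokens, rel_start, rel_end):
    """Return one tag per token: B-REL at the relation-phrase start,
    I-REL inside it, O elsewhere (rel_start/rel_end inclusive)."""
    tags = []
    for i, _ in enumerate(tokens):
        if i == rel_start:
            tags.append("B-REL")
        elif rel_start < i <= rel_end:
            tags.append("I-REL")
        else:
            tags.append("O")
    return tags

tokens = ["Obama", "was", "born", "in", "Hawaii"]
print(relation_tags(tokens, 1, 3))
# ['O', 'B-REL', 'I-REL', 'I-REL', 'O']
```

A tagger trained on such sequences can then recover arbitrary relation phrases without predefined relation types, which is the core idea of open relation tagging.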
Year: 2022
DOI: 10.1016/j.eswa.2022.116951
Venue: Expert Systems with Applications
Keywords: Open relation extraction, Neural sequence tagging, Syntactics, Supervised, Corpus
DocType: Journal
Volume: 200
ISSN: 0957-4174
Citations: 0
PageRank: 0.34
References: 0
Authors: 5
Name           Order  Citations  PageRank
Shengbin Jia   1      0          0.34
E. Shijia      2      0          0.34
Ling Ding      3      0          2.03
Xiaojun Chen   4      1298       107.51
Yang Xiang     5      123        14.35