Title
Embedding Logic Rules Into Recurrent Neural Networks
Abstract
Incorporating prior knowledge into recurrent neural networks (RNNs) is of great importance for many natural language processing tasks. However, most prior knowledge exists as structured knowledge and is difficult to exploit within the existing RNN framework. This paper proposes an effective framework to incorporate prior information into RNN models by extracting logic rules from the structured knowledge and embedding the extracted rules into the RNN. First, we demonstrate that commonly used prior knowledge, including knowledge graphs, social graphs, and syntactic dependencies, can be decomposed into a set of logic rules. Second, we present a technique for embedding a set of logic rules into the RNN by way of feedback masks. Finally, we apply the proposed approach to sentiment classification and named entity recognition tasks. Extensive experimental results verify the effectiveness of the embedding approach, and the encouraging results suggest that it has potential for application to other NLP tasks.
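The abstract names the feedback-mask mechanism but does not specify it. As a rough illustration only, the Python sketch below shows one way a logic rule could be compiled into a binary mask that gates an RNN's recurrent feedback; the example rule, the mask placement, and all identifiers are assumptions made for illustration, not the paper's actual formulation.

# Minimal sketch (not the paper's exact method): a vanilla RNN step in which
# a logic rule is compiled into a binary "feedback mask" that gates which
# hidden units are fed back at the next time step. The rule used here
# (suppress half the hidden units on stop-word tokens) is hypothetical.
import numpy as np

rng = np.random.default_rng(0)
H, X = 8, 5                        # hidden size, input size
Wh = rng.normal(size=(H, H)) * 0.1
Wx = rng.normal(size=(H, X)) * 0.1
b = np.zeros(H)

def rule_mask(token, hidden_size):
    # Hypothetical logic rule -> feedback mask: zero out the first half of
    # the hidden units whenever the token matches the rule's antecedent.
    mask = np.ones(hidden_size)
    if token in {"the", "a", "an"}:    # assumed rule antecedent
        mask[: hidden_size // 2] = 0.0
    return mask

def rnn_step(h_prev, x, mask):
    # The mask gates the recurrent feedback before the usual tanh update.
    return np.tanh(Wh @ (mask * h_prev) + Wx @ x + b)

h = np.zeros(H)
for token in ["the", "cat", "sat"]:
    x = rng.normal(size=X)             # stand-in for the token embedding
    h = rnn_step(h, x, rule_mask(token, H))
print(h.round(3))

In this reading, the rule acts purely through the mask, so the trainable weights are untouched and the rule can be swapped at inference time; whether the paper applies the mask to the feedback path, the input path, or elsewhere is not stated in the abstract.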
Year
2019
DOI
10.1109/ACCESS.2019.2892140
Venue
IEEE ACCESS
Keywords
RNN, logic rules, sentiment classification, named entity recognition
Field
Embedding, Computer science, Recurrent neural network, Artificial intelligence, Rule of inference, Distributed computing
DocType
Journal
Volume
7
ISSN
2169-3536
Citations
0
PageRank
0.34
References
0
Authors
7
Name            Order  Citations  PageRank
Bingfeng Chen   1      0          0.34
Zhifeng Hao     2      653        78.36
Xiaofeng Cai    3      7          1.82
Ruichu Cai      4      241        37.07
Wen Wen         5      8          3.51
Jian Zhu        6      0          0.34
Guangqiang Xie  7      0          2.03