Title
Syntax-Type-Aware Graph Convolutional Networks For Natural Language Understanding
Abstract
The structure of a sentence conveys rich linguistic knowledge and has proven useful for natural language understanding. In this paper, we aim to incorporate syntactic constraints and long-range word dependencies into the sentence encoding procedure using the widely applied Graph Convolutional Network (GCN) and word dependency trees. Existing syntax-aware GCN methods construct the adjacency matrix according to whether two words are connected in the dependency tree, but they fail to model the dependency type, which reflects how the words are linked, and therefore cannot distinguish the different contributions of different word dependency paths. To avoid introducing redundant word dependencies that harm language understanding, we propose a GCN variant extended with a novel Word Dependency Gate mechanism. The Word Dependency Gate adaptively balances the inclusion and exclusion of specific word dependency paths based on the dependency type and the surrounding word context. Experiments show that our approach can effectively incorporate the relevant syntactic dependencies into BERT and achieves state-of-the-art performance on the End-to-End Aspect-Based Sentiment Analysis and Relation Triple Extraction tasks.
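The abstract describes gating each dependency edge by its relation type and word context before GCN aggregation. The paper's exact formulation is not given here; the following is a minimal NumPy sketch of one such gated GCN layer, where the function name, weight shapes, and the scalar sigmoid gate over concatenated head/dependent/type features are all illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_gcn_layer(H, adj, dep_type, type_emb, Wg, Wh):
    """One syntax-type-aware GCN layer with a per-edge dependency gate (sketch).

    H:        (n, d) word representations
    adj:      (n, n) 0/1 dependency-tree adjacency matrix
    dep_type: (n, n) integer ids of the dependency relation on each edge
    type_emb: (num_types, d) dependency-type embeddings (assumed)
    Wg:       (3d,) gate weight vector (assumed)
    Wh:       (d, d) feature transform
    """
    n, d = H.shape
    H_new = np.zeros_like(H)
    for i in range(n):
        agg = np.zeros(d)
        for j in range(n):
            if adj[i, j]:
                # Gate conditioned on the head word, the dependent word,
                # and the embedding of their dependency type.
                feat = np.concatenate([H[i], H[j], type_emb[dep_type[i, j]]])
                g = sigmoid(feat @ Wg)  # scalar gate in (0, 1)
                agg += g * H[j]         # down-weights redundant dependency paths
        H_new[i] = np.maximum(agg @ Wh, 0.0)  # ReLU
    return H_new
```

A gate near 0 effectively prunes an edge, so the layer can exclude dependency paths that would add noise, while a standard syntax-aware GCN would weight all tree edges equally.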
Year
2021
DOI
10.1016/j.asoc.2021.107080
Venue
APPLIED SOFT COMPUTING
Keywords
Sentiment analysis, Relation extraction, GCN, BERT
DocType
Journal
Volume
102
ISSN
1568-4946
Citations
1
PageRank
0.35
References
0
Authors
5
Name | Order | Citations | PageRank
Chunning Du | 1 | 3 | 1.39
J. Wang | 2 | 479 | 95.23
Haifeng Sun | 3 | 68 | 27.77
Qi Qi | 4 | 210 | 56.01
Jianxin Liao | 5 | 457 | 82.08