Title
RABERT: Relation-Aware BERT for Target-Oriented Opinion Words Extraction
Abstract
Target-oriented Opinion Words Extraction (TOWE) is a subtask of aspect-based sentiment analysis, which aims to identify the corresponding opinion terms for given opinion targets in a review. To solve the TOWE task, recent works mainly focus on learning target-aware context representations that infuse target information into the context representation by using various neural networks. However, it has been unclear how to encode the target information into BERT, a powerful pre-trained language model. In this paper, we propose a novel TOWE model, RABERT (Relation-Aware BERT), that can fully utilize BERT to obtain target-aware context representations. To introduce the target information into the BERT layers clearly, we design a simple but effective encoding method that adds target markers indicating the opinion targets to the sentence. In addition, we find that neighbor word information is also important for extracting the opinion terms. Therefore, RABERT employs a target-sentence relation network and a neighbor-aware relation network to consider both the opinion target and the neighbor word information. Our experimental results on four benchmark datasets show that RABERT significantly outperforms the other baselines and achieves state-of-the-art performance. We also demonstrate the effectiveness of each component of RABERT in further analysis.
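To illustrate the target-marker encoding mentioned in the abstract, below is a minimal sketch (not the authors' code) of how an opinion target span in a review sentence could be wrapped with marker tokens before the sequence is passed to BERT. The marker strings "[TGT]" and "[/TGT]", the function name, and the example sentence are assumptions for illustration only; the paper does not specify them here.

# Minimal sketch of target-marker insertion for a TOWE-style BERT input.
# Assumed marker tokens "[TGT]" / "[/TGT]" surround the opinion target span.

def mark_target(tokens, target_start, target_end,
                open_marker="[TGT]", close_marker="[/TGT]"):
    """Wrap the opinion target span [target_start, target_end) with marker tokens."""
    return (tokens[:target_start]
            + [open_marker]
            + tokens[target_start:target_end]
            + [close_marker]
            + tokens[target_end:])

# Usage: "battery life" is the given opinion target; "amazing" is the opinion word
# that a TOWE model would be expected to extract for it.
sentence = "the battery life of this laptop is amazing".split()
print(mark_target(sentence, 1, 3))
# ['the', '[TGT]', 'battery', 'life', '[/TGT]', 'of', 'this', 'laptop', 'is', 'amazing']

The marked token sequence would then be tokenized and encoded by BERT, giving the model an explicit signal about which words constitute the opinion target.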
Year
2021
DOI
10.1145/3459637.3482165
Venue
Conference on Information and Knowledge Management
DocType
Conference
Citations
1
PageRank
0.40
References
0
Authors
4
Name            Order  Citations  PageRank
Taegwan Kang    1      1          1.07
Minwoo Lee      2      1          0.40
Nakyeong Yang   3      1          0.40
Kyomin Jung     4      394        37.38