Title
Position-Aware Hierarchical Transfer Model for Aspect-Level Sentiment Classification
Abstract
Recently, attention-based neural networks (NNs) have been widely used for aspect-level sentiment classification (ASC). Most neural models focus on incorporating the aspect representation into attention; however, the position information of each aspect has not been well studied. Furthermore, the existing ASC datasets are relatively small owing to labor-intensive labeling, which largely limits the performance of NNs. In this paper, we propose a position-aware hierarchical transfer (PAHT) model that models position information at multiple levels and enhances ASC performance by transferring hierarchical knowledge from a resource-rich sentence-level sentiment classification (SSC) dataset. We first present aspect-based positional attention at the word and segment levels to capture more salient information toward a given aspect. To make up for the limited data for ASC, we devise three sampling strategies to select related instances from the large-scale SSC dataset for pre-training, and transfer the learned knowledge into ASC at four levels: embedding, word, segment, and classifier. Extensive experiments on four benchmark datasets demonstrate that the proposed model is effective in improving ASC performance. In particular, our model outperforms state-of-the-art approaches in terms of accuracy on all the datasets considered.
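The abstract's aspect-based positional attention can be illustrated with a minimal sketch. This is not the authors' code: the relative-distance decay weight and the dot-product scoring function below are assumptions chosen for illustration, showing only the general idea of modulating attention scores by each word's distance to the aspect term.

```python
# Illustrative sketch of aspect-based positional attention (hypothetical
# formulation, not the PAHT paper's exact equations).
import numpy as np

def positional_attention(words, aspect_idx, query):
    """words: (n, d) word vectors; aspect_idx: position of the aspect term;
    query: (d,) aspect representation used to score each word."""
    n = len(words)
    # Assumed position weight: words closer to the aspect get weights near 1,
    # decaying linearly with relative distance.
    pos_w = 1.0 - np.abs(np.arange(n) - aspect_idx) / n
    scores = (words @ query) * pos_w        # position-modulated scores
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                    # softmax attention weights
    return alpha @ words                    # (d,) attended sentence vector
```

In this sketch the same weighting could be applied at the segment level by treating each `words[i]` as a segment representation and `aspect_idx` as the index of the segment containing the aspect.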
Year: 2020
DOI: 10.1016/j.ins.2019.11.048
Venue: Information Sciences
Keywords: Aspect-level sentiment classification, Hierarchical attention networks, Neural networks, Transfer learning, Sentiment classification, Position
Field: Embedding, Artificial intelligence, Sampling (statistics), Artificial neural network, Classifier (linguistics), Mathematics, Machine learning, Salient
DocType: Journal
Volume: 513
ISSN: 0020-0255
Citations: 5
PageRank: 0.57
References: 0
Authors: 5
Name, Order, Citations, PageRank
Jie Zhou, 1, 2103, 190.17
Qin Chen, 2, 7, 1.34
Xiangji Huang, 3, 1551, 159.34
Qinmin Vivian Hu, 4, 20, 6.06
Liang He, 5, 36, 16.68