Title
Fusion of heterogeneous attention mechanisms in multi-view convolutional neural network for text classification
Abstract
The rapid proliferation of user-generated content has given rise to large volumes of text corpora. Increasingly, scholars, researchers, and organizations employ text classification to mine novel insights for high-impact applications. Despite their prevalence, conventional text classification methods rely on labor-intensive, task-specific feature engineering, omit long-term relationships, and are not suitable for rapidly evolving domains. While a growing body of deep learning and attention mechanism literature aims to address these issues, extant methods often represent text as a single view and omit multiple sets of features at varying levels of granularity. Recognizing that these issues often result in performance degradation, we propose a novel Spatial View Attention Convolutional Neural Network (SVA-CNN). SVA-CNN leverages an innovative and carefully designed combination of multi-view representation learning, heterogeneous attention mechanisms, and CNN-based operations to automatically extract and weight multiple fine-grained representations at varying levels of granularity. Rigorously evaluating SVA-CNN against prevailing text classification methods on five large-scale benchmark datasets indicates that it outperforms extant deep learning-based classification methods in both performance and training time for document classification, sentiment analysis, and thematic identification applications. To facilitate model reproducibility and extensions, SVA-CNN’s source code is also available via GitHub.
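The abstract only sketches the architecture at a high level. The toy PyTorch example below illustrates the general idea it describes: building multiple views of the same text, re-weighting positions within each view (spatial attention), weighting whole views against each other (view attention), and classifying with a CNN head. It is a minimal sketch under assumed design choices, not the authors' SVA-CNN implementation (which lives in the GitHub repository mentioned above); the class names, the two embedding "views", and all hyperparameters here are hypothetical.

```python
# Illustrative sketch only -- not the authors' SVA-CNN code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialAttention(nn.Module):
    """Toy position-level attention: re-weights each token of a view."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, x):                                  # x: (batch, seq_len, dim)
        weights = torch.softmax(self.score(x), dim=1)      # (batch, seq_len, 1)
        return x * weights


class ViewAttention(nn.Module):
    """Toy view-level attention: weights whole views against each other and fuses them."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, views):                              # views: (batch, num_views, dim)
        weights = torch.softmax(self.score(views), dim=1)  # (batch, num_views, 1)
        return (views * weights).sum(dim=1)                # fused: (batch, dim)


class ToyMultiViewAttnCNN(nn.Module):
    """Hypothetical multi-view CNN classifier combining both attention types."""
    def __init__(self, vocab_size, num_classes, emb_dim=64, n_filters=32):
        super().__init__()
        # Two stand-in "views": separate embedding tables playing the role of
        # fine- and coarse-grained representations of the same token sequence.
        self.emb_fine = nn.Embedding(vocab_size, emb_dim)
        self.emb_coarse = nn.Embedding(vocab_size, emb_dim)
        self.spatial = SpatialAttention(emb_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
        self.view_attn = ViewAttention(n_filters)
        self.fc = nn.Linear(n_filters, num_classes)

    def encode(self, tokens, emb):
        x = self.spatial(emb(tokens))                      # (batch, seq_len, emb_dim)
        x = F.relu(self.conv(x.transpose(1, 2)))           # (batch, n_filters, seq_len)
        return x.max(dim=2).values                         # global max pool: (batch, n_filters)

    def forward(self, tokens):                             # tokens: (batch, seq_len) int ids
        v_fine = self.encode(tokens, self.emb_fine)
        v_coarse = self.encode(tokens, self.emb_coarse)
        fused = self.view_attn(torch.stack([v_fine, v_coarse], dim=1))
        return self.fc(fused)                              # class logits


if __name__ == "__main__":
    model = ToyMultiViewAttnCNN(vocab_size=1000, num_classes=4)
    logits = model(torch.randint(0, 1000, (8, 20)))        # batch of 8 sequences, length 20
    print(logits.shape)                                    # torch.Size([8, 4])
```

In the paper's terms, SpatialAttention stands in for position-level weighting within a view and ViewAttention for weighting granularities against each other; the actual views, attention formulations, and series/parallel connections used by SVA-CNN are specified in the paper and its repository.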
Year
2021
DOI
10.1016/j.ins.2020.10.021
Venue
Information Sciences
Keywords
View attention, Spatial attention, Multi-view representation, Series and parallel connection, Convolutional neural network, Text classification
DocType
Journal
Volume
548
ISSN
0020-0255
Citations
0
PageRank
0.34
References
18
Authors
7
Name            Order   Citations   PageRank
Yunji Liang     1       91          8.61
Huihui Li       2       0           0.34
Bin Guo         3       98          21.15
Zhiwen Yu       4       2753        220.67
Xiaolong Zheng  5       3           3.07
Sagar Samtani   6       4           2.42
Daniel Zeng     7       2539        286.59