Title
Recurrent networks with attention and convolutional networks for sentence representation and classification.
Abstract
In this paper, we propose a bi-attention mechanism, a multi-layer attention mechanism, and a text representation and classification model (ACNN) that combines attention with a convolutional neural network. The bi-attention uses two attention mechanisms to learn two context vectors: a forward RNN with attention learns the forward context vector \(\overrightarrow {\mathbf {c}}\), a backward RNN with attention learns the backward context vector \(\overleftarrow {\mathbf {c}}\), and the two are concatenated to form the context vector c. The multi-layer attention is a stack of bi-attention layers. In the ACNN, the context vector c is obtained by the bi-attention, a convolution operation is then applied to c, and a max-pooling operation reduces its dimensionality, converting the text into a low-dimensional sentence vector m. Finally, a softmax classifier performs the text classification. We evaluate our model on 8 benchmark text classification datasets, where it achieves performance better than or comparable to state-of-the-art methods.
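The abstract describes the pipeline (bi-attention context vectors, convolution, max-pooling, softmax) but not the exact equations. Below is a minimal PyTorch sketch of that pipeline under plain assumptions: GRU recurrences, additive attention scores, and a single convolution layer. All layer names and hyperparameters (fw_score, num_filters, etc.) are illustrative, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ACNN(nn.Module):
    """Sketch of the ACNN pipeline from the abstract (assumed details)."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64,
                 num_filters=100, kernel_size=3, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # forward and backward RNNs, realized as one bidirectional GRU
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                          bidirectional=True)
        # additive attention scores, one scorer per direction (assumption)
        self.fw_score = nn.Linear(hidden_dim, 1)
        self.bw_score = nn.Linear(hidden_dim, 1)
        self.conv = nn.Conv1d(2 * hidden_dim, num_filters, kernel_size)
        self.fc = nn.Linear(num_filters, num_classes)

    def forward(self, tokens):                       # tokens: (B, T)
        h = self.rnn(self.embed(tokens))[0]          # (B, T, 2H)
        h_fw, h_bw = h.chunk(2, dim=-1)              # split the directions
        # attention weights over time for each direction
        a_fw = F.softmax(self.fw_score(h_fw), dim=1)
        a_bw = F.softmax(self.bw_score(h_bw), dim=1)
        # context sequence c: attention-weighted states, concatenated
        c = torch.cat([a_fw * h_fw, a_bw * h_bw], dim=-1)   # (B, T, 2H)
        # convolution over the context sequence, then max-pool over time
        feats = F.relu(self.conv(c.transpose(1, 2)))        # (B, F, T')
        m = feats.max(dim=2).values                         # sentence vector m
        return F.log_softmax(self.fc(m), dim=-1)            # class log-probs

# usage with made-up sizes: a batch of 4 sentences of length 20
model = ACNN(vocab_size=10000, num_classes=5)
log_probs = model(torch.randint(0, 10000, (4, 20)))  # shape (4, 5)
```

One design note on the sketch: the abstract is ambiguous about whether c is a single vector or a sequence; since a convolution and max-pooling follow it, the sketch keeps c as a sequence of attention-weighted states so the convolution has a time axis to slide over.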
Year
2018
DOI
10.1007/s10489-018-1176-4
Venue
Appl. Intell.
Keywords
Natural language processing, Deep neural networks, Attention mechanism, Representation learning, Text classification
Field
Softmax function, Pattern recognition, Convolutional neural network, Convolution, Computer science, Concatenation, Artificial intelligence, Classifier (linguistics), Sentence, Deep neural networks, Feature learning
DocType
Journal
Volume
48
Issue
10
ISSN
0924-669X
Citations
4
PageRank
0.41
References
29
Authors
4
Name           Order  Citations  PageRank
Tengfei Liu    1      92         7.09
Shuangyuan Yu  2      5          1.14
Baomin Xu      3      66         7.66
Hongfeng Yin   4      4          0.41