Abstract |
---|
Recurrent Neural Networks (RNNs) are among the most popular architectures for processing variable-length text, showing strong results on many Natural Language Processing (NLP) tasks and remarkable performance in capturing long-term dependencies. Many models built on RNNs have achieved excellent results. However, most of these models ignore the positions of the crucial words in a sentence and the semantic connections in different directions, and so do not make full use of the available information. We observe that some words have a strong effect on the meaning of the whole sentence, while others have little influence. To address these problems, in this paper we propose Bidirectional Gated Recurrent Units (BGRU) integrated with a novel attention pooling that is combined with max-pooling, which automatically attends to the crucial words and retains a more meaningful representation of the text, allowing us to encode longer sequences. It not only prevents important information from being discarded but can also filter out noise. We evaluate the proposed model on multiple tasks, including sentiment classification, movie review data, and a subjective classification dataset, measuring accuracy by comparing predicted labels with the correct labels. The experimental results show that our model achieves excellent performance on these tasks.
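The abstract does not give the pooling formulas, but the combination it describes — attention pooling over the BGRU's per-timestep hidden states, concatenated with element-wise max-pooling — can be sketched in plain Python. This is an illustrative sketch under assumptions: `attention_max_pool` and its `query` argument (a learned attention vector) are hypothetical names, not from the paper, and real models would use tensor libraries and learned parameters.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of attention scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_max_pool(hidden_states, query):
    """Combine attention pooling with max-pooling over hidden states.

    hidden_states: list of T vectors (lists of floats), e.g. the
        per-timestep outputs of a bidirectional GRU (assumed given).
    query: a learned attention vector of the same dimension
        (hypothetical; passed in as a plain list for illustration).
    Returns the concatenation [attention_summary ; max_summary].
    """
    dim = len(hidden_states[0])
    # Attention scores: dot product of each hidden state with the query.
    scores = [sum(h[i] * query[i] for i in range(dim))
              for h in hidden_states]
    weights = softmax(scores)
    # Attention pooling: weighted sum of hidden states, so crucial
    # timesteps contribute more to the sentence representation.
    att = [sum(w * h[i] for w, h in zip(weights, hidden_states))
           for i in range(dim)]
    # Max-pooling: element-wise maximum over timesteps, so strong
    # features are kept rather than discarded.
    mx = [max(h[i] for h in hidden_states) for i in range(dim)]
    return att + mx
```

Concatenating the two summaries gives a fixed-size vector regardless of sequence length, which is what allows a downstream classifier to handle variable-length input.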
Year | DOI | Venue |
---|---|---|
2018 | 10.1145/3297156.3297267 | Proceedings of the 2018 2nd International Conference on Computer Science and Artificial Intelligence |
Keywords | Field | DocType |
---|---|---|
Natural language processing, Neural Network, Gated Recurrent Units, Text Classification | ENCODE, Filter noise, Convolutional neural network, Computer science, Pooling, Recurrent neural network, Speech recognition, Sentence | Conference |
ISBN | Citations | PageRank |
---|---|---|
978-1-4503-6606-9 | 1 | 0.35 |
References | Authors |
---|---|
21 | 6 |
Name | Order | Citations | PageRank |
---|---|---|---|
Mingbo Hong | 1 | 1 | 0.35 |
Mantao Wang | 2 | 1 | 0.35 |
Lixin Luo | 3 | 232 | 9.10 |
Xuefeng Tan | 4 | 1 | 0.35 |
Dejun Zhang | 5 | 238 | 19.97 |
Yike Lao | 6 | 1 | 0.35 |