Title
Inner Attention based Recurrent Neural Networks for Answer Selection
Abstract
Attention-based recurrent neural networks have shown advantages in representing natural language sentences (Hermann et al., 2015; Rocktäschel et al., 2015; Tan et al., 2015). In these models, which build on recurrent neural networks (RNNs), external attention information is added to the hidden representations to obtain an attentive sentence representation. Despite the improvement over non-attentive models, the attention mechanism under RNNs is not well studied. In this work, we analyze the deficiencies of traditional attention-based RNN models quantitatively and qualitatively. We then present three new RNN models that add the attention information before the RNN hidden representation, which show advantages in representing sentences and achieve new state-of-the-art results on the answer selection task.
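The abstract contrasts adding attention after the RNN (weighting the hidden states) with adding it before (weighting the input words). Below is a minimal PyTorch sketch of that word-level inner-attention idea, under stated assumptions: the module name InnerAttentionEncoder, the GRU encoder, the bilinear sigmoid scoring, and the mean pooling are illustrative choices, not the authors' released code.

import torch
import torch.nn as nn

class InnerAttentionEncoder(nn.Module):
    """Sketch: attend over input word embeddings before the RNN."""

    def __init__(self, embed_dim: int, hidden_dim: int):
        super().__init__()
        # Bilinear matching matrix between the question representation
        # and each answer word (assumed parameterization).
        self.M = nn.Parameter(torch.randn(embed_dim, embed_dim) * 0.01)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, answer_embs: torch.Tensor, question_rep: torch.Tensor) -> torch.Tensor:
        # answer_embs: (batch, seq_len, embed_dim) word embeddings x_t
        # question_rep: (batch, embed_dim) question representation r_q
        # Per-word relevance score r_q^T M x_t, squashed to (0, 1).
        scores = torch.einsum('be,ef,bsf->bs', question_rep, self.M, answer_embs)
        alpha = torch.sigmoid(scores).unsqueeze(-1)   # (batch, seq_len, 1)
        # Attention is applied to the inputs *before* the recurrence,
        # so question relevance can shape every hidden state.
        weighted = alpha * answer_embs
        outputs, _ = self.gru(weighted)               # (batch, seq_len, hidden_dim)
        return outputs.mean(dim=1)                    # pooled sentence representation

A traditional attention-based RNN would instead run the GRU first and weight its output states; moving the weighting before the recurrence is what lets attention influence the hidden representations themselves, which is the change the abstract describes.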
Year
2016
DOI
10.18653/v1/p16-1122
Venue
Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, Vol. 1
DocType
Conference
Volume
P16-1
Citations
32
PageRank
0.94
References
14
Authors
3
Name            Order   Citations   PageRank
Bingning Wang   1       49          4.26
Kang Liu        2       1542        89.33
Jun Zhao        3       2119        115.52