Abstract |
---|
Attention-based recurrent neural networks have shown advantages in representing natural language sentences (Hermann et al., 2015; Rocktäschel et al., 2015; Tan et al., 2015). Building on recurrent neural networks (RNNs), external attention information is added to the hidden representations to obtain an attentive sentence representation. Despite the improvement over non-attentive models, the attention mechanism in RNNs is not well studied. In this work, we analyze the deficiency of traditional attention-based RNN models quantitatively and qualitatively. We then present three new RNN models that add attention information before the RNN hidden representation is computed, which yields better sentence representations and achieves new state-of-the-art results on the answer selection task. |
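The contrast the abstract draws can be sketched numerically: traditional attention scores the RNN's hidden states *after* the recurrence, while the proposed variants inject question-guided attention into the inputs *before* the recurrence. The sketch below is a minimal NumPy illustration under assumed shapes and bilinear scoring matrices (`M_out`, `M_in` are hypothetical parameters, not the paper's exact formulation); the pooling choices are likewise illustrative.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def rnn(xs, W_x, W_h):
    """Minimal tanh RNN; returns all hidden states, shape (T, d_h)."""
    h = np.zeros(W_h.shape[0])
    out = []
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h)
        out.append(h)
    return np.stack(out)

d_in, d_h, T = 4, 5, 6
rng = np.random.default_rng(0)
W_x = rng.normal(scale=0.1, size=(d_h, d_in))
W_h = rng.normal(scale=0.1, size=(d_h, d_h))
xs  = rng.normal(size=(T, d_in))    # answer word embeddings (toy data)
q   = rng.normal(size=d_h)          # fixed question representation (toy data)

# (a) Traditional attention: run the RNN, then weight the hidden states.
M_out = rng.normal(scale=0.1, size=(d_h, d_h))
hs = rnn(xs, W_x, W_h)
w_out = softmax(hs @ M_out @ q)     # one attention weight per timestep
rep_outer = w_out @ hs              # attentive sentence vector, shape (d_h,)

# (b) Attention before the recurrence: weight the inputs by the question,
#     then run the RNN over the re-weighted word embeddings.
M_in = rng.normal(scale=0.1, size=(d_h, d_in))
w_in = softmax(xs @ M_in.T @ q)     # attention computed on the inputs
hs_in = rnn(w_in[:, None] * xs, W_x, W_h)
rep_inner = hs_in.mean(axis=0)      # e.g. mean-pool the hidden states

print(rep_outer.shape, rep_inner.shape)
```

In (a) the attention weights can only rescale states the RNN has already mixed, whereas in (b) the question influences which words the recurrence attends to in the first place, which is the intuition behind adding attention before the hidden representation.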
Year | DOI | Venue
---|---|---
2016 | 10.18653/v1/p16-1122 | Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, Vol. 1

DocType | Volume | Citations
---|---|---
Conference | P16-1 | 32

PageRank | References | Authors
---|---|---
0.94 | 14 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Bingning Wang | 1 | 49 | 4.26 |
Kang Liu | 2 | 1542 | 89.33 |
Jun Zhao | 3 | 2119 | 115.52 |