Abstract
---
We propose a novel neural attention architecture to tackle machine comprehension tasks, such as answering Cloze-style queries with respect to a document. Unlike previous models, we do not collapse the query into a single vector; instead, we deploy an iterative alternating attention mechanism that allows a fine-grained exploration of both the query and the document. Our model outperforms state-of-the-art baselines on standard machine comprehension benchmarks such as the CNN news articles and the Children's Book Test (CBT) datasets.
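Since the abstract only names the mechanism, the sketch below illustrates one plausible reading of iterative alternating attention in PyTorch: an inference state alternately attends over the encoded query and then the encoded document, and both attentive reads update the state through a GRU cell. The class name `AlternatingAttention`, the bilinear scoring form, the zero-initialized state, and the fixed step count are assumptions for illustration, not the authors' exact formulation; details of the paper such as the gating of the attentive reads are omitted.

```python
# Minimal sketch of iterative alternating attention (simplified; the
# bilinear scorers, zero-initialized state, and fixed step count are
# assumptions, not the paper's exact design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AlternatingAttention(nn.Module):
    def __init__(self, hid, steps=8):
        super().__init__()
        self.steps = steps
        # Bilinear scorers for query-side and document-side attention.
        self.q_score = nn.Bilinear(hid, hid, 1)
        self.d_score = nn.Bilinear(2 * hid, hid, 1)
        # Inference GRU consumes [query read; document read] each step.
        self.cell = nn.GRUCell(2 * hid, hid)

    def forward(self, doc, query):
        # doc:   (batch, n_doc, hid)  encoded document tokens
        # query: (batch, n_q, hid)    encoded query tokens
        batch, n_doc, hid = doc.shape
        state = doc.new_zeros(batch, hid)
        d_att = None
        for _ in range(self.steps):
            # 1) Attend over the query given the current inference state.
            q_logits = self.q_score(
                state.unsqueeze(1).expand(-1, query.size(1), -1).contiguous(),
                query,
            ).squeeze(-1)
            q_att = F.softmax(q_logits, dim=1)
            q_read = torch.bmm(q_att.unsqueeze(1), query).squeeze(1)
            # 2) Attend over the document given the state and the query read.
            glimpse = torch.cat([state, q_read], dim=1)
            d_logits = self.d_score(
                glimpse.unsqueeze(1).expand(-1, n_doc, -1).contiguous(),
                doc,
            ).squeeze(-1)
            d_att = F.softmax(d_logits, dim=1)
            d_read = torch.bmm(d_att.unsqueeze(1), doc).squeeze(1)
            # 3) Update the inference state with both attentive reads.
            state = self.cell(torch.cat([q_read, d_read], dim=1), state)
        # Final document attention, usable for scoring candidate answers.
        return d_att

if __name__ == "__main__":
    # Example: batch of 2, 50 document tokens, 10 query tokens, hid=64.
    model = AlternatingAttention(hid=64, steps=8)
    att = model(torch.randn(2, 50, 64), torch.randn(2, 10, 64))
    print(att.shape)  # torch.Size([2, 50])
```

In a Cloze-style setting, a natural use of the returned document distribution is to sum the attention mass over each candidate answer's occurrences in the document and pick the highest-scoring candidate; whether the paper scores answers exactly this way is not stated in the abstract.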
Year | Venue | DocType | Volume | Citations | PageRank | References | Authors
---|---|---|---|---|---|---|---|
2016 | CoRR | Journal | abs/1606.02245 | 23 | 1.05 | 12 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Alessandro Sordoni | 1 | 801 | 38.18 |
Philip Bachman | 2 | 23 | 1.05
Yoshua Bengio | 3 | 42677 | 3039.83 |