Title
Iterative Alternating Neural Attention for Machine Reading.
Abstract
We propose a novel neural attention architecture to tackle machine comprehension tasks, such as answering Cloze-style queries with respect to a document. Unlike previous models, we do not collapse the query into a single vector; instead, we deploy an iterative alternating attention mechanism that allows a fine-grained exploration of both the query and the document. Our model outperforms state-of-the-art baselines on standard machine comprehension benchmarks such as CNN news articles and the Children's Book Test (CBT) dataset.
Year: 2016
Venue: CoRR
DocType: Journal
Volume: abs/1606.02245
Citations: 23
PageRank: 1.05
References: 12
Authors: 3
Name	Order	Citations	PageRank
Alessandro Sordoni	1	801	38.18
Phillip Bachman	2	23	1.05
Yoshua Bengio	3	42677	3039.83