Title
Long Short-Term Memory-Networks for Machine Reading
Abstract
Machine reading, the automatic understanding of text, remains a challenging task of great value for NLP applications. We propose a machine reader that processes text incrementally from left to right, linking the current word to previous words stored in memory and implicitly discovering the lexical dependencies that facilitate understanding. The reader is equipped with a Long Short-Term Memory architecture that differs from previous work in having a memory tape (instead of a single memory cell) for adaptively storing past information without severe information compression. We also show how to integrate our reader, via a new attention mechanism, with an encoder-decoder architecture. Experiments on language modeling, sentiment analysis, and natural language inference show that our model matches or outperforms the state of the art.
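The abstract's core idea, replacing the LSTM's single memory cell with tapes of all past hidden and cell states that are read by attention at every step, can be sketched in a few lines. The Python snippet below is a minimal illustration, not the authors' released code: the function name, dimensions, and the simplified one-term attention score (roughly v^T tanh(W_h h_i + W_x x_t), without the extra recurrent attention term in the paper) are assumptions made for the sketch.

import torch
import torch.nn.functional as F

def lstmn_step(x_t, H, C, Wx, Wh, v, W_gates, b_gates):
    """One recurrence step of the memory-tape idea.

    x_t:  (d_in,) current input embedding
    H, C: (t, d)  memory tapes holding every previous hidden / cell state
    """
    # Intra-attention over all past positions: score each stored state
    # against the current input (simplified attention form; see lead-in).
    scores = torch.tanh(H @ Wh + x_t @ Wx) @ v         # (t,)
    s = F.softmax(scores, dim=0)
    h_tilde = s @ H   # attention-weighted summary of the hidden tape
    c_tilde = s @ C   # attention-weighted summary of the cell tape
    # Standard LSTM gates, with the attended summaries standing in for
    # the usual single previous hidden / cell state.
    z = torch.cat([h_tilde, x_t]) @ W_gates + b_gates  # (4d,)
    i, f, o, g = z.chunk(4)
    c_t = torch.sigmoid(f) * c_tilde + torch.sigmoid(i) * torch.tanh(g)
    h_t = torch.sigmoid(o) * torch.tanh(c_t)
    return h_t, c_t

# Toy usage: process a short sequence, growing the tapes at each step
# instead of overwriting one state (no severe information compression).
torch.manual_seed(0)
d_in, d = 8, 16
Wx, Wh, v = torch.randn(d_in, d), torch.randn(d, d), torch.randn(d)
W_gates, b_gates = 0.1 * torch.randn(d + d_in, 4 * d), torch.zeros(4 * d)
H = torch.zeros(1, d)  # tapes seeded with one initial zero state
C = torch.zeros(1, d)
for _ in range(5):
    h_t, c_t = lstmn_step(torch.randn(d_in), H, C, Wx, Wh, v, W_gates, b_gates)
    H = torch.cat([H, h_t.unsqueeze(0)])
    C = torch.cat([C, c_t.unsqueeze(0)])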
Year
2016
DOI
10.18653/v1/D16-1053
Venue
EMNLP
DocType
Conference
Volume
abs/1601.06733
Citations
106
PageRank
3.45
References
37
Authors
3
Name            Order  Citations  PageRank
Jianpeng Cheng  1      239        11.13
Li Dong         2      582        31.86
Mirella Lapata  3      5973       369.52