Title
Incremental sentence compression using LSTM recurrent networks
Abstract
Many current sentence compression techniques produce a shortened form of a sentence by relying on syntactic structure such as dependency tree representations. While the performance of sentence compression has been improving, these approaches require a full parse of the sentence before compression can begin, making it difficult to perform compression in real time. In this paper, we examine the possibility of performing incremental sentence compression using long short-term memory (LSTM) recurrent neural networks (RNNs). The decision of whether to remove a word is made at each time step, without waiting for the end of the sentence. Various RNN parameters are investigated, including the number of layers and network connections. Furthermore, we propose a pretraining method in which the network is first trained as an autoencoder. Experimental results reveal that our method achieves compression rates similar to those of human references and higher accuracy than state-of-the-art tree transduction models.
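The abstract describes a unidirectional LSTM that decides, word by word, whether to keep or remove each token as it arrives, plus an autoencoder pretraining stage. A minimal PyTorch sketch of the per-time-step decision loop follows; it is an illustrative reconstruction, not the authors' implementation, and the class name, layer sizes, and keep/remove label convention are all assumptions.

import torch
import torch.nn as nn

class IncrementalCompressor(nn.Module):
    # Hypothetical model: an embedding layer feeding a unidirectional LSTM,
    # with a two-way classifier (keep vs. remove) applied at every time step.
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Unidirectional so each decision depends only on the words seen so
        # far, which is what makes the compression incremental.
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers=num_layers)
        self.classify = nn.Linear(hidden_dim, 2)

    def step(self, word_id, state=None):
        # Consume one word id (shape (1,)) and return keep/remove logits
        # together with the recurrent state for the next time step.
        emb = self.embed(word_id).view(1, 1, -1)  # (seq=1, batch=1, dim)
        out, state = self.lstm(emb, state)
        return self.classify(out.squeeze(0)), state

# Toy usage: word ids arrive one at a time; no full parse is required.
model = IncrementalCompressor(vocab_size=10000)
state, kept = None, []
for wid in [12, 7, 301]:  # placeholder ids
    logits, state = model.step(torch.tensor([wid]), state)
    if logits.argmax(dim=-1).item() == 0:  # assume class 0 means "keep"
        kept.append(wid)

The autoencoder pretraining mentioned in the abstract (training the network to reproduce its input before fine-tuning it for deletion decisions) is omitted from this sketch.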
Year
2015
DOI
10.1109/ASRU.2015.7404802
Venue
2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU)
Keywords
Sentence compression, recurrent neural network, long short-term memory
Field
Dependency tree, Autoencoder, Computer science, Recurrent neural network, Long short-term memory, Speech recognition, Sentence compression, Natural language processing, Artificial intelligence, Parsing, Sentence, Syntactic structure
DocType
Conference
Citations
0
PageRank
0.34
References
21
Authors
6
Name                Order   Citations   PageRank
Sakriani Sakti      1       257         65.02
Faiz Ilham          2       0           0.34
Graham Neubig       3       989         130.31
Tomoki Toda         4       1874        167.18
Ayu Purwarianti     5       38          13.96
Satoshi Nakamura    6       1099        194.59