Title: Is sentence compression an NLG task?
Abstract: Data-driven approaches to sentence compression define the task as dropping any subset of words from the input sentence while retaining important information and grammaticality. We show that only 16% of the observed compressed sentences in the domain of subtitling can be accounted for in this way. We argue that part of this is due to evaluation issues and estimate that a deletion model is in fact compatible with approximately 55% of the observed data. We analyse the remaining problems and conclude that in those cases word order changes and paraphrasing are crucial, and argue for more elaborate sentence compression models which build on NLG work.
Year: 2009
Venue: natural language generation
Keywords: observed data, input sentence, elaborate sentence compression model, nlg task, evaluation issue, deletion model, cases word order change, remaining problem, nlg work, important information, word order
DocType: Conference
Citations: 2
PageRank: 0.39
References: 16
Authors: 4
Name              Order  Citations  PageRank
Erwin Marsi       1      543        46.13
Emiel Krahmer     2      866        110.30
Iris Hendrickx    3      285        30.91
Walter Daelemans  4      2019       269.73