Title
Towards Evaluating the Impact of Anaphora Resolution on Text Summarisation from a Human Perspective.
Abstract
Automatic Text Summarisation (TS) is the process of abstracting key content from information sources. Previous research has attempted to combine diverse NLP techniques to improve the quality of the produced summaries. The study reported in this paper seeks to establish whether Anaphora Resolution (AR) can improve the quality of generated summaries, and to assess whether AR has the same impact on text from different subject domains. Summarisation evaluation is critical to the development of automatic summarisation systems. Previous studies have evaluated their summaries using automatic techniques. However, automatic techniques lack the ability to evaluate certain factors that are better quantified by human judges. In this paper, the summaries are evaluated via human judgment, taking the following factors into consideration: informativeness, readability and understandability, conciseness, and the overall quality of the summary. Overall, the results of this study show a pattern of slight but not statistically significant increases in the quality of summaries produced using AR. At the subject-domain level, however, the results demonstrate that the contribution of AR to TS is domain dependent, and that for some domains it has a statistically significant impact on TS.
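The abstract pairs Anaphora Resolution with a TextRank summariser (per the paper's keywords). A minimal, self-contained sketch of that pipeline is shown below; the dictionary-based AR step is a toy stand-in for a real resolver, and the example text, function names, and parameters are illustrative assumptions, not taken from the paper.

```python
import math
import re

def resolve_anaphora(text, antecedents):
    """Toy anaphora-resolution step: substitute caller-supplied antecedents
    for pronouns. A real AR component would infer these mappings itself."""
    for pronoun, noun in antecedents.items():
        text = re.sub(rf"\b{pronoun}\b", noun, text)
    return text

def similarity(s1, s2):
    """TextRank-style sentence similarity: word overlap normalised by
    the log lengths of the two sentences."""
    w1, w2 = set(s1.lower().split()), set(s2.lower().split())
    overlap = len(w1 & w2)
    if overlap == 0:
        return 0.0
    return overlap / (math.log(len(w1) + 1) + math.log(len(w2) + 1))

def textrank_summary(text, n=1, damping=0.85, iterations=50):
    """Rank sentences by running PageRank over a sentence-similarity graph
    and return the top n sentences in document order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    k = len(sentences)
    sim = [[similarity(a, b) if i != j else 0.0
            for j, b in enumerate(sentences)]
           for i, a in enumerate(sentences)]
    scores = [1.0] * k
    for _ in range(iterations):
        scores = [(1 - damping) + damping * sum(
                      sim[j][i] / sum(sim[j]) * scores[j]
                      for j in range(k) if sim[j][i] > 0)
                  for i in range(k)]
    top = sorted(range(k), key=scores.__getitem__, reverse=True)[:n]
    return [sentences[i] for i in sorted(top)]

# Illustrative usage: resolve a pronoun first, then summarise.
text = ("Text summarisation selects key sentences from a document. "
        "Anaphora resolution links pronouns to the entities they mention. "
        "It may therefore help summarisation select better sentences.")
resolved = resolve_anaphora(text, {"It": "Anaphora resolution"})
summary = textrank_summary(resolved, n=1)
```

Resolving pronouns before ranking gives the similarity graph explicit entity mentions to match on, which is the intuition behind combining AR with TextRank that the study evaluates.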
Year
2016
DOI
10.1007/978-3-319-41754-7_16
Venue
Lecture Notes in Computer Science
Keywords
Text summarisation, Anaphora resolution, TextRank
Field
Data mining, Information retrieval, Computer science, Readability, Human judgment, Artificial intelligence, Natural language processing
DocType
Conference
Volume
9612
ISSN
0302-9743
Citations
1
PageRank
0.35
References
11
Authors
6
Name | Order | Citations | PageRank
Mostafa Bayomi | 1 | 5 | 2.45
Killian Levacher | 2 | 12 | 5.09
M. Rami Ghorab | 3 | 69 | 8.08
Peter Lavin | 4 | 3 | 1.06
Alexander O'connor | 5 | 184 | 14.32
Séamus Lawless | 6 | 111 | 30.18