Title
Be Consistent! Improving Procedural Text Comprehension using Label Consistency
Abstract
Our goal is procedural text comprehension, namely tracking how the properties of entities (e.g., their location) change over time in a procedural text (e.g., a paragraph about photosynthesis, a recipe). The task is challenging because the state of the world changes throughout the text and, despite recent advances, current systems still struggle with it. Our approach leverages the fact that, for many procedural texts, multiple independent descriptions are readily available, and that predictions from them should be consistent (label consistency). We present a new learning framework that exploits label consistency during training, allowing a consistency bias to be built into the model. Evaluation on a standard benchmark dataset for procedural text, ProPara (Dalvi et al., 2018), shows that our approach significantly improves prediction performance (F1) over prior state-of-the-art systems.
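
The core idea in the abstract (train the model so that predictions from independent descriptions of the same process agree) can be illustrated with a small sketch. The code below is a hypothetical PyTorch illustration, not the paper's actual model or objective: it pairs two paragraphs describing the same process and adds a symmetric-KL agreement term to the usual supervised loss. All class names, hyperparameters, and the label-aggregation scheme are assumptions made for this example.

import torch
import torch.nn as nn
import torch.nn.functional as F


class ProceduralTagger(nn.Module):
    """Toy tagger: maps per-step sentence encodings to state-change label logits."""

    def __init__(self, hidden_dim: int = 64, num_labels: int = 4):
        super().__init__()
        self.encoder = nn.GRU(input_size=hidden_dim, hidden_size=hidden_dim,
                              batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, sent_embeds: torch.Tensor) -> torch.Tensor:
        # sent_embeds: (batch, num_steps, hidden_dim)
        hidden, _ = self.encoder(sent_embeds)
        return self.classifier(hidden)  # (batch, num_steps, num_labels)


def consistency_loss(logits_a: torch.Tensor, logits_b: torch.Tensor) -> torch.Tensor:
    """Symmetric KL between the two paragraphs' aggregated label distributions.

    The paired paragraphs may have different numbers of steps, so this sketch
    compares each paragraph's average distribution over labels rather than
    aligning steps one-to-one (an assumption for illustration only).
    """
    dist_a = F.softmax(logits_a, dim=-1).mean(dim=1)  # (batch, num_labels)
    dist_b = F.softmax(logits_b, dim=-1).mean(dim=1)
    kl_ab = F.kl_div(dist_a.log(), dist_b, reduction="batchmean")
    kl_ba = F.kl_div(dist_b.log(), dist_a, reduction="batchmean")
    return 0.5 * (kl_ab + kl_ba)


model = ProceduralTagger()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
consistency_weight = 0.5  # hypothetical trade-off hyperparameter

# Toy paired batch: two independent paragraphs describing the same 8 processes.
para_a = torch.randn(8, 6, 64)        # described in 6 steps
para_b = torch.randn(8, 9, 64)        # same processes, described in 9 steps
gold_a = torch.randint(0, 4, (8, 6))  # gold labels available for paragraph A only

logits_a = model(para_a)
logits_b = model(para_b)
supervised = F.cross_entropy(logits_a.reshape(-1, 4), gold_a.reshape(-1))
loss = supervised + consistency_weight * consistency_loss(logits_a, logits_b)

optimizer.zero_grad()
loss.backward()
optimizer.step()

Used this way, the agreement term penalizes the model when its labels for two descriptions of the same process diverge, which is one simple way to build a consistency bias into training.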
Year
2019
Venue
North American Chapter of the Association for Computational Linguistics (NAACL)
Field
Computer science, Text comprehension, Artificial intelligence, Natural language processing
DocType
Journal
Citations
0
PageRank
0.34
References
0
Authors
7
Name                    Order  Citations  PageRank
Xinya Du                1      10         3.55
Bhavana Bharat Dalvi    2      201        17.31
Niket Tandon            3      146        17.32
Antoine Bosselut        4      49         6.11
Wen-tau Yih             5      3238       204.01
Peter Clark             6      780        72.67
Claire Cardie           7      5591       555.20