Title
In Search of Ambiguity: A Three-Stage Workflow Design to Clarify Annotation Guidelines for Crowd Workers
Abstract
We propose a novel three-stage FIND-RESOLVE-LABEL workflow for crowdsourced annotation that reduces ambiguity in task instructions and thus improves annotation quality. Stage 1 (FIND) asks the crowd to find examples whose correct label seems ambiguous given the task instructions; workers also provide a short tag describing the ambiguous concept embodied by each instance found. We compare collaborative vs. non-collaborative designs for this stage. In Stage 2 (RESOLVE), the requester selects one or more of these ambiguous examples and labels them, resolving the ambiguity. The new label(s) are automatically injected back into the task instructions to improve clarity. Finally, in Stage 3 (LABEL), workers perform the actual annotation using the revised guidelines with clarifying examples. We compare three designs for presenting this clarifying information: examples only, tags only, or both. We report image labeling experiments over six task designs on Amazon Mechanical Turk. Results show improved annotation accuracy and yield further insights into effective design of crowdsourced annotation tasks.
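To make the data flow concrete, the following is a minimal Python sketch of the FIND-RESOLVE-LABEL pipeline as described in the abstract. It is an illustration only, not the authors' implementation: every name here (AmbiguousExample, Guidelines, inject, render, the DOG/NOT-DOG task, and the show_examples/show_tags flags standing in for the examples-only/tags-only/both designs) is a hypothetical construction for this note.

    # Hypothetical sketch of the FIND-RESOLVE-LABEL data flow; names are
    # invented for illustration, not taken from the paper's system.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class AmbiguousExample:
        item_id: str               # instance a worker flagged in Stage 1 (FIND)
        tag: str                   # worker's short tag for the ambiguous concept
        label: Optional[str] = None  # set by the requester in Stage 2 (RESOLVE)

    @dataclass
    class Guidelines:
        instructions: str
        clarifications: List[AmbiguousExample] = field(default_factory=list)

        def inject(self, example: AmbiguousExample) -> None:
            # Stage 2: fold a requester-resolved example back into the guidelines.
            assert example.label is not None, "requester must resolve the label first"
            self.clarifications.append(example)

        def render(self, show_examples: bool = True, show_tags: bool = True) -> str:
            # Stage 3 task text; the two flags mirror the three presentation
            # designs compared in the paper (examples only, tags only, both).
            lines = [self.instructions]
            for ex in self.clarifications:
                parts = []
                if show_examples:
                    parts.append(f"item {ex.item_id} -> {ex.label}")
                if show_tags:
                    parts.append(f"(concept: {ex.tag})")
                lines.append("Clarification: " + " ".join(parts))
            return "\n".join(lines)

    # Example run: one ambiguous instance found, resolved, and injected.
    guidelines = Guidelines("Label each image as DOG or NOT-DOG.")
    found = AmbiguousExample(item_id="img_042", tag="cartoon animal")
    found.label = "NOT-DOG"        # requester's decision in Stage 2
    guidelines.inject(found)
    print(guidelines.render(show_examples=True, show_tags=True))

One design note on this sketch: keeping the requester's RESOLVE decision separate from the injection step makes the Stage 3 guidelines a pure function of the resolved examples, which mirrors the automatic injection of new labels into the instructions that the abstract describes.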
Year: 2022
DOI: 10.3389/frai.2022.828187
Venue: Frontiers in Artificial Intelligence
Keywords: ambiguity, annotation, artificial intelligence, clarification, crowdsourcing, guidelines, labeling, machine learning
DocType: Journal
Volume: 5
ISSN: 2624-8212
Citations: 0
PageRank: 0.34
References: 0
Authors: 3

Name                  | Order | Citations | PageRank
Vivek Krishna Pradhan | 1     | 0         | 0.34
Mike Schaekermann     | 2     | 0         | 0.34
Matthew Lease         | 3     | 0         | 0.34