Title
Measuring Crowdsourcing Effort with Error-Time Curves
Abstract
Crowdsourcing systems lack effective measures of the effort required to complete each task. Without knowing how much time workers need to execute a task well, requesters struggle to accurately structure and price their work. Objective measures of effort could better help workers identify tasks that are worth their time. We propose a data-driven effort metric, ETA (error-time area), that can be used to determine a task's fair price. It empirically models the relationship between time and error rate by manipulating the time that workers have to complete a task. ETA reports the area under the error-time curve as a continuous metric of worker effort. The curve's 10th percentile is also interpretable as the minimum time most workers require to complete the task without error, which can be used to price the task. We validate the ETA metric on ten common crowdsourcing tasks, including tagging, transcription, and search, and find that ETA closely tracks how workers would rank these tasks by effort. We also demonstrate how ETA allows requesters to rapidly iterate on task designs and measure whether the changes improve worker efficiency. Our findings can facilitate the process of designing, pricing, and allocating crowdsourcing tasks.
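As a rough illustration of the ETA computation described in the abstract (not the authors' exact procedure), the sketch below assumes error rates have already been measured at a handful of enforced time limits; the data values, variable names, and the reading of the "10th percentile" as the time limit at which the error rate drops to 10% are all assumptions for illustration.

```python
import numpy as np

# Hypothetical measurements: for each enforced time limit (seconds),
# the fraction of worker submissions that contained errors.
time_limits = np.array([5, 10, 20, 40, 80])          # seconds allowed per task
error_rates = np.array([0.90, 0.60, 0.30, 0.10, 0.02])  # observed error fraction

# ETA-style effort score: area under the error-time curve.
# Tasks that keep producing errors at longer time limits get larger areas.
eta_area = np.trapz(error_rates, time_limits)

# One plausible pricing threshold: the smallest time limit at which the
# observed error rate falls to 10% or below (assumed interpretation).
below_10pct = time_limits[error_rates <= 0.10]
min_time_for_quality = below_10pct.min() if below_10pct.size else None

print(f"error-time area: {eta_area:.1f}")
print(f"time limit at which ~90% of workers are error-free: {min_time_for_quality}s")
```

With the hypothetical numbers above, the area summarizes overall effort on a continuous scale, while the 40-second threshold is the kind of value a requester might use to set a fair per-task price.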
Year
2015
DOI
10.1145/2702123.2702145
Venue
CHI
Keywords
crowdsourcing, microtasks, miscellaneous, task effort
Field
Data mining, Computer science, Crowdsourcing, Word error rate, Fair value, Minimum time, Percentile
DocType
Conference
Citations
15
PageRank
0.55
References
14
Authors
3
Name                   Order  Citations  PageRank
Justin Cheng           1      799        34.10
Jaime Teevan           2      4041       209.15
Michael S. Bernstein   3      8604       393.80