Title
Autocompletion interfaces make crowd workers slower, but their use promotes response diversity.
Abstract
Creative tasks such as ideation or question proposal are powerful applications of crowdsourcing, yet the quantity of workers available for addressing practical problems is often insufficient. To enable scalable crowdsourcing thus requires gaining all possible efficiency and information from available workers. One option for text-focused tasks is to allow assistive technology, such as an autocompletion user interface (AUI), to help workers input text responses. But support for the efficacy of AUIs is mixed. Here we designed and conducted a randomized experiment where workers were asked to provide short text responses to given questions. Our experimental goal was to determine if an AUI helps workers respond more quickly and with improved consistency by mitigating typos and misspellings. Surprisingly, we found that neither occurred: workers assigned to the AUI treatment were slower than those assigned to the non-AUI control and their responses were more diverse, not less, than those of the control. Both the lexical and semantic diversities of responses were higher, with the latter measured using word2vec. A crowdsourcer interested in worker speed may want to avoid using an AUI, but using an AUI to boost response diversity may be valuable to crowdsourcers interested in receiving as much novel information from workers as possible.
Year: 2017
DOI: 10.15346/hc.v6i1.3
Venue: arXiv: Human-Computer Interaction
Field: Ideation, Randomized experiment, Crowdsourcing, Computer science, Human–computer interaction, User interface, Multimedia, Scalability
Volume: abs/1707.06939
Journal: Human Computation 6:1:42-55 (2019)
Issue: 1
Citations: 0
PageRank: 0.34
References: 1
Authors: 2
Name             Order  Citations  PageRank
Xipei Liu        1      0          0.34
James P. Bagrow  2      281        26.25