Title
Salience Bias in Crowdsourcing Contests
Abstract
Crowdsourcing relies on online platforms to connect a community of users to perform specific tasks. However, without appropriate control, the behavior of the online community might not align with the platform's designed objective, which can lead to inferior platform performance. This paper investigates how feedback information on a crowdsourcing platform and the systematic biases of crowdsourcing workers can affect crowdsourcing outcomes. Specifically, using archival data from the online crowdsourcing platform Kaggle, combined with survey data from actual Kaggle contest participants, we examine the role of a systematic bias, namely the salience bias, in influencing the performance of crowdsourcing workers, and how the number of crowdsourcing workers moderates the impact of the salience bias on contest outcomes. Our results suggest that the salience bias influences the performance of contestants, including the winners of the contests. Furthermore, the number of participating contestants may attenuate or amplify the impact of the salience bias on contest outcomes, depending on the effort required to complete the tasks. Our results have critical implications for crowdsourcing firms and platform designers. The online appendix is available at https://doi.org/10.1287/isre.2018.0775.
Year
2018
DOI
10.1287/isre.2018.0775
Venue
Information Systems Research
Keywords
behavioral economics, crowdsourcing, open innovation, salience bias, parallel path effect, competition effect
DocType
Journal
Volume
29
Issue
2
ISSN
1526-5536
Citations
2
PageRank
0.38
References
16
Authors
4
Name                 Order  Citations  PageRank
Ho Cheung Brian Lee  1      4          1.09
Sulin Ba             2      1402       133.29
Xinxin Li            3      27         8.16
Jan Stallaert        4      572        73.86