Title |
---|
Understanding and Mitigating Worker Biases in the Crowdsourced Collection of Subjective Judgments |
Abstract |
---|
Crowdsourced data acquired from tasks that comprise a subjective component (e.g., opinion detection, sentiment analysis) is potentially affected by the inherent bias of the crowd workers who contribute to the tasks. This can lead to biased and noisy ground-truth data, which propagates undesirable bias and noise when used in turn to train machine learning models or evaluate systems. In this work, we aim to understand the influence of workers' own opinions on their performance in the subjective task of bias detection. We analyze the influence of workers' opinions on their annotations across different topics. Our findings reveal that workers with strong opinions tend to produce biased annotations, and that even experienced crowd workers fail to distance themselves from their own opinions to provide unbiased annotations. We show that such bias can be mitigated to improve the overall quality of the data collected. |
Year | DOI | Venue |
---|---|---|
2019 | 10.1145/3290605.3300637 | CHI |
Keywords | Field | DocType |
---|---|---|
bias, crowdsourcing, microtasks, workers | Crowdsourcing, Sentiment analysis, Computer science, Human–computer interaction, Opinion detection | Conference |
ISBN | Citations | PageRank |
---|---|---|
978-1-4503-5970-2 | 2 | 0.37 |
References | Authors |
---|---|
0 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Christoph Hube | 1 | 2 | 2.73 |
Besnik Fetahu | 2 | 148 | 19.26 |
Ujwal Gadiraju | 3 | 69 | 8.42 |