Title
CrowdTerrier: automatic crowdsourced relevance assessments with Terrier
Abstract
In this demo, we present CrowdTerrier, an infrastructure extension to the open source Terrier IR platform that enables the semi-automatic generation of relevance assessments for a variety of document ranking tasks using crowdsourcing. The aim of CrowdTerrier is to reduce the time and expertise required to effectively crowdsource relevance assessments by abstracting away from the complexities of the crowdsourcing process. It achieves this by automating the assessment process as much as possible, via a close integration of the IR system that ranks the documents (Terrier) and the crowdsourcing marketplace that is used to assess those documents (Amazon's Mechanical Turk).
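As a rough illustration of the workflow the abstract describes (rank documents with Terrier, then crowdsource their assessment on Mechanical Turk), the following is a minimal sketch only. It is not the CrowdTerrier implementation, which is a Java extension to Terrier whose API is not shown in this record; it assumes the PyTerrier bindings and the boto3 MTurk client, and the index path, reward, and question template are illustrative placeholders.

```python
# Hypothetical sketch: Terrier ranking + Mechanical Turk assessment HITs.
# Not the CrowdTerrier codebase; PyTerrier/boto3, index path, reward and
# question wording are all assumptions made for illustration.
import boto3
import pyterrier as pt

pt.init()

# Rank documents for a query with a Terrier BM25 model (index path is assumed).
index = pt.IndexFactory.of("./terrier_index")
bm25 = pt.BatchRetrieve(index, wmodel="BM25", num_results=10)
results = bm25.search("crowdsourced relevance assessment")

# Connect to the Mechanical Turk sandbox (safer than the live marketplace for testing).
mturk = boto3.client(
    "mturk",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

QUESTION_XML = """<?xml version="1.0" encoding="UTF-8"?>
<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>relevance</QuestionIdentifier>
    <QuestionContent><Text>Is document {docno} relevant to the query "{query}"?</Text></QuestionContent>
    <AnswerSpecification>
      <SelectionAnswer>
        <Selections>
          <Selection><SelectionIdentifier>rel</SelectionIdentifier><Text>Relevant</Text></Selection>
          <Selection><SelectionIdentifier>nonrel</SelectionIdentifier><Text>Not relevant</Text></Selection>
        </Selections>
      </SelectionAnswer>
    </AnswerSpecification>
  </Question>
</QuestionForm>"""

# Publish one assessment HIT per retrieved query-document pair.
for _, row in results.iterrows():
    mturk.create_hit(
        Title="Judge document relevance",
        Description="Decide whether a document is relevant to a search query",
        Reward="0.05",
        MaxAssignments=3,
        LifetimeInSeconds=86400,
        AssignmentDurationInSeconds=600,
        Question=QUESTION_XML.format(docno=row["docno"], query=row["query"]),
    )
```

A real deployment would additionally collect the submitted assignments, resolve disagreements between workers, and write the resulting judgments back as qrels for the ranking experiments; those steps are omitted here.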
Year
2012
DOI
10.1145/2348283.2348430
Venue
SIGIR
Keywords
mechanical turk, document ranking task, crowdsourcing marketplace, close integration, crowdsourcing process, assessment process, crowdsource relevance assessment, automatic crowdsourced relevance assessment, terrier ir platform, relevance assessment, ir system, crowdsourcing
Field
Data science, Data mining, World Wide Web, Ranking, Computer science, Crowdsourcing, Crowdsource
DocType
Conference
Citations
1
PageRank
0.35
References
1
Authors
3
Name                Order  Citations  PageRank
Richard McCreadie   1      403        32.43
Craig Macdonald     2      2588       178.50
Iadh Ounis          3      3438       234.59