ERICA: Expert Guidance in Validating Crowd Answers
File version
Accepted Manuscript (AM)
Author(s)
Duong, Chi Thang
Weidlich, Matthias
Aberer, Karl
Location
Santiago, Chile
Abstract
Crowdsourcing has become an essential tool for a broad range of Web applications. Yet, the wide-ranging levels of expertise among crowd workers, as well as the presence of faulty workers, call for quality control of the crowdsourcing result. To this end, many crowdsourcing platforms feature a post-processing phase in which crowd answers are validated by experts. However, this approach incurs high costs, since expert input is a scarce resource. To support the expert in the validation process, we present a tool for ExpeRt guidance In validating Crowd Answers (ERICA). It guides the expert's work by collecting input on the most problematic cases, thereby achieving a set of high-quality answers even if the expert does not validate the complete answer set. The tool also supports the task requester in selecting the most cost-efficient allocation of the budget between the expert and the crowd.
Conference Title
SIGIR 2015: Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval
Rights Statement
© ACM, 2015. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in SIGIR 2015: Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval.
Subject
Database systems