Incentives for Effort in Crowdsourcing using the Peer Truth Serum

Crowdsourcing is widely proposed as a method for solving a large variety of judgement tasks, such as classifying website content, peer grading in online courses, or collecting real-world data. Because the data reported by workers cannot be verified, there is a tendency to report random data without actually solving the task. This can be countered by making the reward for an answer depend on its consistency with the answers given by other workers, an approach called peer consistency. However, the best strategy in such schemes is for all workers to report the same answer without solving the task. Dasgupta and Ghosh (WWW 2013) show that in some cases exerting high effort can be encouraged in the highest-paying equilibrium. In this paper we present a general mechanism that implements this idea and is applicable to most crowdsourcing settings. Furthermore, we experimentally test the novel mechanism and validate its theoretical properties.
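To make the peer-consistency idea concrete, one common form of a Peer Truth Serum style reward pays a worker only when their answer matches that of a randomly chosen peer, scaled inversely by the answer's prior probability, so that agreement on an unexpected answer pays more than agreement on the default one. The sketch below is illustrative only; the function name, the prior, and the example values are assumptions, not the paper's exact mechanism.

```python
def pts_reward(answer, peer_answer, prior):
    """Illustrative Peer Truth Serum style payment rule.

    A worker is rewarded only when its answer matches a randomly
    selected peer's answer, and the reward is inversely proportional
    to the answer's prior probability, so that "surprising" agreement
    pays more than agreement on the expected answer.
    `prior` maps each possible answer to its assumed frequency.
    """
    if answer == peer_answer:
        return 1.0 / prior[answer]
    return 0.0

# Hypothetical task with two answers believed to occur 80% / 20% of the time.
prior = {"A": 0.8, "B": 0.2}
print(pts_reward("B", "B", prior))  # rare agreement pays 5.0
print(pts_reward("A", "A", prior))  # common agreement pays 1.25
print(pts_reward("A", "B", prior))  # disagreement pays 0.0
```

The inverse-prior scaling is what removes the incentive to blindly coordinate on the most common answer: matching on it yields only a small payment, while truthful reports that reveal rarer ground truth are rewarded disproportionately.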

Published in:
ACM Transactions on Intelligent Systems and Technology, vol. 7, no. 4, article 48
New York: Association for Computing Machinery
