Survey of Web-based Crowdsourcing Frameworks for Subjective Quality Assessment

The popularity of crowdsourcing for performing various tasks online has increased significantly in the past few years. The low cost and flexibility of crowdsourcing have, in particular, attracted researchers in the field of subjective multimedia evaluation and Quality of Experience (QoE). Since online assessment of multimedia content is challenging, several dedicated frameworks have been created to aid in designing such tests, including support for testing methodologies like ACR (Absolute Category Rating), DCR (Degradation Category Rating), and PC (Pair Comparison), setting up the tasks, running training sessions, screening the subjects, and storing the resulting data. In this paper, we focus on web-based frameworks for multimedia quality assessment that support commonly used crowdsourcing platforms such as Amazon Mechanical Turk and Microworkers. We provide a detailed overview of these crowdsourcing frameworks and evaluate them to help researchers in the field of QoE assessment select the frameworks and crowdsourcing platforms that are adequate for their experiments.

Presented at:
16th International Workshop on Multimedia Signal Processing, Jakarta, Indonesia, September 22-24, 2014

