Crowdsourcing-based multimedia subjective evaluations: a case study on image recognizability and aesthetic appeal

Research on Quality of Experience (QoE) relies heavily on subjective evaluations of media. An important aspect of QoE concerns modeling and quantifying the subjective notions of 'beauty' (aesthetic appeal) and 'something well-known' (content recognizability), both of which are subject to cultural and social effects. Crowdsourcing, which allows employing people worldwide to perform short and simple tasks via online platforms, can be a great tool for performing subjective studies in a time- and cost-effective way. On the other hand, the crowdsourcing environment does not allow the degree of experimental control that is necessary to guarantee reliable subjective data. To validate the use of crowdsourcing for QoE assessments, in this paper we evaluate the aesthetic appeal and recognizability of images using the Microworkers crowdsourcing platform and compare the outcomes with more conventional evaluations conducted in a controlled lab environment. We find a high correlation between crowdsourcing and lab scores for recognizability but not for aesthetic appeal, indicating that crowdsourcing can be used for QoE subjective assessments as long as the workers' tasks are designed with extreme care to avoid misinterpretations.

Presented at:
The 2nd International ACM Workshop on Crowdsourcing for Multimedia, CrowdMM'13, Barcelona, Spain, October 21, 2013

