Abstract

Our user feedback framework requires robust techniques to tackle the scalability issue of schema matching networks. One approach is to employ crowdsourcing/human computation models. Crowdsourcing is a cutting-edge research area in which human workers perform pre-defined tasks. In this literature review, we explore concepts such as tasks, workflows, feedback aggregation, quality control, and reward systems. We show that many of these aspects can be integrated into our user feedback framework.
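
As a simple illustration of the feedback-aggregation concept listed above, the sketch below combines worker answers on candidate attribute correspondences by majority voting; the function name and example data are hypothetical and not part of the framework described in the review.

    from collections import Counter

    def aggregate_feedback(votes):
        """Majority-vote aggregation: for each task, return the answer chosen
        by most workers together with a simple agreement score."""
        result = {}
        for task_id, answers in votes.items():
            counts = Counter(answers)
            best_answer, best_count = counts.most_common(1)[0]
            result[task_id] = (best_answer, best_count / len(answers))
        return result

    # Hypothetical example: three workers validate two candidate correspondences.
    votes = {
        "addr <-> address": ["correct", "correct", "incorrect"],
        "phone <-> fax": ["incorrect", "incorrect", "incorrect"],
    }
    print(aggregate_feedback(votes))

More elaborate aggregation schemes from the crowdsourcing literature weight each worker's answer by an estimated reliability instead of counting votes equally; the majority vote above is only the simplest baseline.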
