Peer knowledge modeling in computer-supported collaborative learning

Learners benefit from collaboration because it triggers effective interaction processes such as externalization, elicitation and negotiation of knowledge. To communicate effectively, learners need a certain representation of their peers' knowledge. We refer to the process of building and maintaining a representation of peers' knowledge as peer knowledge modeling. The present thesis aims to contribute to the fields of computer-supported collaborative learning and work (CSCL, CSCW) on three main levels. First, as an empirical contribution, we investigate the process of peer knowledge modeling in the context of CSCL. Our main research question concerns the effects of socio-cognitive support, providing co-learners with cues about their peer's prior knowledge, on collaborative learning outcomes and processes. In an empirical study (the KAT experiment), university students (N=64) participated in a remote, computer-mediated dyadic learning scenario. They were provided (or not) with a visual representation of their partner's prior-knowledge level through a Knowledge Awareness Tool (KAT). Results showed that the KAT enhances co-learners' collaborative learning gain. This effect appears to be mediated by the positive effect of the KAT on participants' accuracy in estimating their peer's knowledge. Analyses at the process level showed that participants in the KAT condition produced more elaborate utterances. Interactions in KAT-condition dyads focused more on knowledge negotiation, whereas control-condition dyads focused mainly on task completion. The KAT appears to provide a sensitizing metacognitive support, structuring and regulating the collaboration by helping co-learners cope with their knowledge gaps and discrepancies.
Second, as a methodological contribution, we examine the affordances of dual eye-tracking as an innovative methodology for investigating, at a deeper level, the socio-cognitive processes underlying collaboration. We introduce DUET (DUal Eye-Tracking), a multimodal method for collecting rich data featuring peers' synchronized gaze patterns, verbal interaction and, potentially, their activities. We examine the main research applications of DUET and exemplify them with analyses conducted in the context of the KAT experiment. The DUET method appears to be a promising technique for investigating collaborative processes in depth. Third, as a computational contribution, we build on micro-level analyses of the verbal referencing process to introduce and test REGARD, a computational model that automatically detects verbal references and locates the specific object of reference. Test results show reasonably good accuracy of the REGARD algorithm in detecting verbal references and associating them with their objects of reference. We discuss the design and research applications of the REGARD model for the fields of CSCL, CSCW and, more generally, HCI.
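To make the detect-and-associate idea concrete, the sketch below shows one plausible way such a pipeline could be structured: flag utterances that contain deictic expressions, then associate each flagged utterance with the object the speaker fixated most around the moment of speech. This is a minimal illustration only, not the REGARD model itself; the keyword lexicon, the `Fixation` type, the 2-second window and all function names are assumptions introduced for this example.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    """A gaze fixation on an on-screen object; times in milliseconds."""
    object_id: str
    start_ms: int
    end_ms: int

# Assumption: a simple lexicon of deictic expressions that often signal a
# verbal reference ("look at this", "what is that part?", ...).
DEICTIC_TERMS = {"this", "that", "here", "there", "it"}

def detect_reference(utterance: str) -> bool:
    """Flag an utterance as containing a (candidate) verbal reference."""
    words = {w.strip(".,!?").lower() for w in utterance.split()}
    return bool(words & DEICTIC_TERMS)

def resolve_reference(utterance_ms, fixations, window_ms=2000):
    """Return the object with the longest gaze dwell time within a
    time window centered on the utterance, or None if nothing fits."""
    lo, hi = utterance_ms - window_ms, utterance_ms + window_ms
    dwell = {}
    for f in fixations:
        overlap = min(f.end_ms, hi) - max(f.start_ms, lo)
        if overlap > 0:
            dwell[f.object_id] = dwell.get(f.object_id, 0) + overlap
    return max(dwell, key=dwell.get) if dwell else None
```

For instance, if the speaker says "What does this part do?" at t = 5500 ms while having mostly fixated a diagram between 4500 and 6000 ms, the resolver returns that diagram's identifier. A real model would of course rely on richer linguistic and temporal features than this dwell-time heuristic.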

