Abstract

Many factors beyond the knowledge required influence learners' performance on an activity. Learners' on-task effort is widely acknowledged to be strongly related to their educational outcomes, as it reflects how actively they are engaged in the activity. However, effort is not directly observable. Multimodal data can provide additional insights into learning processes and may allow for effort estimation. This paper presents an approach for classifying effort in an adaptive assessment context. Specifically, the behaviour of 32 students was captured during an adaptive self-assessment activity, using logs and physiological data (i.e., eye-tracking, EEG, wristband and facial expressions). We applied k-means to the multimodal data to cluster students' behavioural patterns. Next, we predicted students' effort to complete the upcoming task, based on the discovered behavioural patterns, using a combination of Hidden Markov Models (HMMs) and the Viterbi algorithm. We also compared the results with those of other state-of-the-art classification algorithms (SVM, Random Forest). Our findings provide evidence that HMMs can encode the relationship between effort and behaviour (captured by the multimodal data) more efficiently than the other methods. Most importantly, a practical implication of the approach is that the derived HMMs also pinpoint the moments at which to provide preventive/prescriptive feedback to learners in real time, by building on the relationship between behavioural patterns and the effort learners are putting in.
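The following is a minimal sketch of the kind of pipeline the abstract describes: k-means discretises multimodal features into behavioural patterns, and an HMM with Viterbi decoding infers a latent effort-state sequence from them. It assumes Python with scikit-learn and hmmlearn; the feature matrix, the numbers of clusters and hidden states, and the windowing are all hypothetical placeholders, not the authors' implementation.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from hmmlearn import hmm

# Hypothetical multimodal feature matrix: one row per task window,
# columns aggregating eye-tracking, EEG, wristband and facial features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))

# Step 1: cluster windows into discrete behavioural patterns with k-means.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X)
)

# Step 2: fit an HMM whose observations are the behavioural clusters;
# its hidden states are interpreted here as latent effort levels.
model = hmm.CategoricalHMM(n_components=3, n_iter=100, random_state=0)
model.fit(clusters.reshape(-1, 1))

# Step 3: Viterbi decoding yields the most likely effort-state sequence;
# the final states can be read as the effort heading into the next task.
effort_states = model.predict(clusters.reshape(-1, 1))
print(effort_states[-10:])
```

In this sketch the HMM is trained on a single sequence; with per-student sequences one would pass the concatenated observations together with a `lengths` argument to `fit`.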
