Teaching Analytics: Towards Automatic Extraction of Orchestration Graphs Using Wearable Sensors

"Teaching analytics" is the application of learning analytics techniques to understand teaching and learning processes, and eventually to enable supportive interventions. However, in the case of (often half-improvised) teaching in face-to-face classrooms, such interventions would first require an understanding of what the teacher actually did, as the starting point for teacher reflection and inquiry. Currently, such characterization of teacher enactment requires costly manual coding by researchers. This paper presents a case study exploring the potential of machine learning techniques to automatically extract teaching actions during classroom enactment from five data sources collected using wearable sensors (eye-tracking, EEG, accelerometer, audio and video). Our results highlight the feasibility of this approach, with high accuracy in determining the social plane of interaction (90%, k=0.8). Reliably detecting the concrete teaching activity (e.g., explanation vs. questioning) remains challenging (67%, k=0.56), a fact that prompts further research on multimodal features and models for teaching activity extraction, as well as the collection of a larger multimodal dataset to improve the accuracy and generalizability of these methods.
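The abstract reports its results as accuracy together with Cohen's kappa (e.g., 90%, k=0.8), which corrects for chance agreement between predicted and ground-truth codes. As a minimal sketch of how these two metrics relate (using hypothetical social-plane labels, not the paper's data):

```python
from collections import Counter

def accuracy(y_true, y_pred):
    """Fraction of predictions matching the ground-truth labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(y_true)
    p_observed = accuracy(y_true, y_pred)
    true_counts = Counter(y_true)
    pred_counts = Counter(y_pred)
    # Chance agreement: probability both coders pick the same class at random.
    p_expected = sum(true_counts[c] * pred_counts[c] for c in true_counts) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical labels for the social plane of interaction (illustrative only).
y_true = ["class", "group", "class", "individual", "group", "class"]
y_pred = ["class", "group", "class", "group", "group", "class"]

print(accuracy(y_true, y_pred))      # observed agreement
print(cohens_kappa(y_true, y_pred))  # chance-corrected agreement
```

Note that kappa is always at or below raw accuracy: a classifier that mostly predicts the majority class can score high accuracy but near-zero kappa, which is why the paper reports both.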

Published in:
Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, 148-157
Presented at:
International Conference on Learning Analytics & Knowledge, Edinburgh, UK, April 25-26, 2016
New York: Association for Computing Machinery

Record created 2016-02-19, last modified 2019-08-12
