Abstract

Although it is frequently claimed that learning analytics can improve students' self-evaluation and self-regulated learning, most learning analytics tools appear to have been developed in response to existing data rather than from a clear pedagogical model, and there is consequently little evidence of impact on learning. Even fewer tools seem to be informed by an understanding of the social context and social practices within which they would be used, and there is correspondingly little evidence that learning analytics tools are actually changing practice. This paper draws on research into self-regulated learning and into the social practices of learning and assessment to clarify a series of design issues that should be considered by those seeking to develop learning analytics tools intended to improve student self-evaluation and self-regulation. It presents a case study of how these design issues influenced the development of one such tool: the Learning Companion.

Details