Abstract

Recent advances in computational linguistics can be leveraged to nudge students with adaptive self-evaluation tailored to their argumentation skill level. To investigate how individual argumentation self-evaluation helps students write more convincing texts, we designed ArgumentFeedback, an intelligent argumentation writing support system grounded in nudging theory, and evaluated it in a series of three qualitative and quantitative studies with a total of 83 students. We found that students who received a self-evaluation nudge wrote more convincing texts with higher formal and perceived argumentation quality than the control group. The measured self-efficacy and technology acceptance provide promising evidence for embedding adaptive argumentation writing support tools, combined with digital nudging, in traditional learning settings to foster self-regulated learning. Our results indicate that designing nudging-based learning applications for self-regulated learning, combined with computational methods for argumentation self-evaluation, is beneficial for fostering students' writing skills.

Details