One Code to Predict Them All: Universal Encoding for Inquiry Modeling
Interactive simulations enhance science education and foster inquiry skills, but their open-ended nature can be cognitively overwhelming. While adaptive systems offer timely support, research on predicting conceptual understanding in these environments is limited. Most models are simulation-specific, leading to time-consuming and non-generalizable solutions. In this paper, we introduce a universal encoding that converts lower-level interaction data into higher-level features applicable across various open-ended learning environments (OELEs). This encoding aims to offer a general framework for modeling inquiry across environments and to alleviate challenges such as the "cold start" problem. Our findings demonstrate that models trained on the universal encoding perform comparably to or better than study-specific encodings across multiple contexts. Code is available at https://github.com/epfl-ml4ed/universal-oele.
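To make the idea concrete, the following is a minimal, hypothetical sketch of converting low-level, simulation-specific event logs into generic, environment-agnostic features. The function name, feature names, and event format are illustrative assumptions, not the paper's actual encoding.

```python
# Hypothetical sketch of a "universal encoding": raw (timestamp, action)
# events from any simulation are mapped to generic features such as action
# counts and pause statistics. All names here are illustrative, not the
# encoding proposed in the paper.
from collections import Counter

def universal_encode(events):
    """Map a chronologically ordered list of (timestamp, action_type)
    tuples to environment-agnostic features."""
    counts = Counter(action for _, action in events)
    timestamps = [t for t, _ in events]
    # Pauses between consecutive actions, a common proxy for reflection.
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {
        "n_actions": len(events),
        "n_distinct_actions": len(counts),
        "mean_pause": sum(gaps) / len(gaps) if gaps else 0.0,
        "longest_pause": max(gaps) if gaps else 0.0,
    }

# Example log from a hypothetical physics simulation.
log = [(0.0, "set_variable"), (2.5, "run_trial"),
       (9.0, "observe"), (9.5, "run_trial")]
features = universal_encode(log)
```

Because these features refer only to generic properties of the interaction stream, the same extractor can be reused across environments, which is what allows a single model to be trained on data from multiple simulations.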
2-s2.0-105012023148
École Polytechnique Fédérale de Lausanne
École Polytechnique Fédérale de Lausanne
Technion - Israel Institute of Technology
The Royal Institute of Technology (KTH)
École Polytechnique Fédérale de Lausanne
2025
978-3-031-98461-7
978-3-031-98462-4
528
Lecture Notes in Computer Science (LNAI); 15881
1611-3349
0302-9743
60
67
REVIEWED
EPFL
| Event name | Event acronym | Event place | Event date |
| --- | --- | --- | --- |
| | | Palermo, Italy | 2025-07-22 - 2025-07-26 |