A First Step in Using Machine Learning Methods to Enhance Interaction Analysis for Embodied Learning Environments
Journal
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ISSN
0302-9743
Date Issued
2024-01-01
Author(s)
Fonteles, Joyce
Davalos, Eduardo
Zhang, Yike
Zhou, Mengxi
Ayalon, Efrat
Lane, Alicia
Steinberg, Selena
Anton, Gabriella
Danish, Joshua
Enyedy, Noel
Biswas, Gautam
Abstract
Investigating children’s embodied learning in mixed-reality environments, where they collaboratively simulate scientific processes, requires analyzing complex multimodal data to interpret their learning and coordination behaviors. Learning scientists have developed Interaction Analysis (IA) methodologies for analyzing such data, but these require researchers to watch hours of video to extract and interpret students’ learning patterns. Our study aims to simplify this task by using Machine Learning and Multimodal Learning Analytics to support the IA process. It combines machine learning algorithms with multimodal analyses to support and streamline researchers’ efforts to develop a comprehensive understanding of students’ scientific engagement through their movements, gaze, and affective responses in a simulated scenario. To facilitate an effective researcher-AI partnership, we present an initial case study that examines the feasibility of visually representing students’ states, actions, gaze, affect, and movement on a timeline. The case study focuses on a specific science scenario in which students learn about photosynthesis. The timeline allows us to investigate the alignment of critical learning moments identified by multimodal and interaction analysis, and to uncover insights into students’ temporal learning progressions.
Volume
14830 LNAI
Subjects