Emotion recognition from multiple modalities: Fundamentals and methodologies
Humans are emotional creatures. Multiple modalities are often involved when we express
emotions, whether we do so explicitly (such as through facial expression and speech) or …
Building pipelines for educational data using AI and multimodal analytics: A “grey‐box” approach
K Sharma, Z Papamitsiou… - British Journal of …, 2019 - Wiley Online Library
Students' on‐task engagement during adaptive learning activities has a significant effect on
their performance, and at the same time, how these activities influence students' behavior is …
Automated gaze-based mind wandering detection during computerized learning in classrooms
We investigate the use of commercial off-the-shelf (COTS) eye-trackers to automatically
detect mind wandering—a phenomenon involving a shift in attention from task-related to …
Utilizing multimodal data through fsQCA to explain engagement in adaptive learning
Investigating and explaining the patterns of learners' engagement in adaptive learning
conditions is a core issue towards improving the quality of personalized learning services …
Modeling educational discourse with natural language processing
N Dowell, V Kovanovic - Education, 2022 - solaresearch.org
The broadening adoption of technology enhanced learning environments has substantially
altered the manner in which educational communication takes place, with most people …
Time to scale: Generalizable affect detection for tens of thousands of students across an entire school year
We developed generalizable affect detectors using 133,966 instances of 18 affective states
collected from 69,174 students who interacted with an online math learning platform called …
Dyadic affect in parent-child multimodal interaction: Introducing the dami-p2c dataset and its preliminary analysis
H Chen, S Alghowinem, SJ Jang… - IEEE Transactions …, 2022 - ieeexplore.ieee.org
High-quality parent-child conversational interactions are crucial for children's social,
emotional, and cognitive development. However, many children have limited exposure to …
Multimodal user state and trait recognition: An overview
B Schuller - The Handbook of Multimodal-Multisensor Interfaces …, 2018 - dl.acm.org
It seems intuitive, if not obvious, that for intelligent interaction and communication between
technical systems and human users the knowledge of the user states and traits (for …
Affect Behavior Prediction: Using Transformers and Timing Information to Make Early Predictions of Student Exercise Outcome
Early prediction of student outcomes as they practice with an intelligent tutoring system is
crucial for providing timely and effective interventions to students, potentially improving their …
Multimodal interaction, interfaces, and analytics
S Oviatt - Handbook of Human Computer Interaction, 2022 - Springer
Multimodal-multisensor interfaces have coevolved rapidly with the emergence of mobile
devices (e.g., smartphones), and they are now the dominant computer interface worldwide …