I am privileged to be part of the LEADS partnership, a research network led by Susanne Lajoie at McGill University. This paper is the first to come from our subproject, which has a rather long title: “Systematic Evaluation of the Effectiveness of TREs through Software Platform Development for Data Mining across Multiple Disciplines and Tracking Changes in Affective and Cognitive Growths”.
J. M. Harley, F. Bouchet, S. Hussain, R. Azevedo, & R. Calvo; A Multi-Componential Analysis of Emotions during Complex Learning with an Intelligent Multi-Agent System; AERA 2014 Symposium: Interdisciplinary Approaches for Analysing Data from Multiple Affective Channels with Computer-Based Learning Environments.
Abstract. In this paper we discuss the methodology and results of aligning three different emotional measurement methods (automatic facial expression recognition, self-report, electrodermal activation) and their agreement regarding learners’ emotions. Data was collected from 67 undergraduate students from a North American university who interacted with MetaTutor, an intelligent, multi-agent, hypermedia environment for learning about the human circulatory system, during a one-hour learning session (Azevedo et al., 2013; Harley, Bouchet, & Azevedo, 2013). A webcam was used to capture videos of learners’ facial expressions, which were analyzed using automatic facial expression recognition software (FaceReader 5.0). Learners’ physiological arousal was measured using Affectiva’s Q-Sensor 2.0 electrodermal activation bracelet. Learners self-reported their experience of 19 different emotional states (including basic, learner-centered, and academic achievement emotions) using the Emotion-Value questionnaire (Harley et al., 2013). They did so on five different occasions during the learning session, and these occasions were used as markers to align data from FaceReader and Q-Sensor. We found high agreement between the facial and self-report data (75.6%) when similar emotions were grouped together along theoretical dimensions and definitions (e.g., anger and frustration; Harley et al., 2013). However, our new results examining the agreement between the Q-Sensor and these two methods suggest that electrodermal (EDA/physiological) indices of emotions do not have a tightly coupled relationship with them (Gross, Sheppes, & Urry, 2011). Explanations for this finding are discussed.
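To make the alignment step more concrete: the abstract describes using the five self-report occasions as time markers for the FaceReader and Q-Sensor streams, and grouping similar emotions (e.g., anger and frustration) before computing agreement between channels. The sketch below is only a hypothetical illustration of that kind of procedure in Python; the column names (`timestamp`, `emotion`), the 10-second window, the function names, and the emotion grouping are my own assumptions for illustration, not the analysis pipeline actually used in the paper.

```python
# Hypothetical sketch of marker-based alignment and inter-channel agreement.
# All names, parameters, and groupings are illustrative assumptions.
import pandas as pd

# Illustrative grouping of similar emotions along theoretical dimensions,
# e.g., anger and frustration collapsed into one category.
EMOTION_GROUPS = {
    "anger": "anger/frustration",
    "frustration": "anger/frustration",
    "happiness": "happiness/joy",
    "joy": "happiness/joy",
}


def window_around_marker(stream: pd.DataFrame, marker_time: pd.Timestamp,
                         seconds: int = 10) -> pd.DataFrame:
    """Rows of a timestamped channel (e.g., FaceReader or Q-Sensor output)
    falling in a short window ending at a self-report occasion."""
    start = marker_time - pd.Timedelta(seconds=seconds)
    mask = (stream["timestamp"] >= start) & (stream["timestamp"] <= marker_time)
    return stream[mask]


def dominant_emotion(window: pd.DataFrame) -> str:
    """Most frequent emotion label in the window, mapped to its group."""
    label = window["emotion"].mode().iloc[0]
    return EMOTION_GROUPS.get(label, label)


def agreement_rate(facial_labels: list[str], self_reports: list[str]) -> float:
    """Proportion of self-report occasions where the grouped facial label
    matches the grouped self-reported emotion."""
    matches = sum(
        EMOTION_GROUPS.get(f, f) == EMOTION_GROUPS.get(s, s)
        for f, s in zip(facial_labels, self_reports)
    )
    return matches / len(self_reports)


if __name__ == "__main__":
    # Toy example with five self-report occasions (fabricated labels).
    facial = ["anger", "joy", "happiness", "frustration", "joy"]
    reported = ["frustration", "happiness", "joy", "anger", "anger"]
    print(f"Agreement: {agreement_rate(facial, reported):.1%}")  # -> 80.0%
```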
Read the full paper