Detecting Learner Engagement in MOOCs using Automatic
Facial Expression Recognition
Abhilash Dubbaka and Anandha Gopalan
Abstract:
Dropout rates in Massive Open Online Courses (MOOCs) are very high.
Although there are many external factors, such as users not having
enough time or not intending to complete the course, there are some
aspects that instructors can control to optimally engage their students.
To do this, they need to know how students engage throughout their video
lectures. This paper explores the use of webcams to record students'
facial expressions whilst they watched educational video material to
analyse their Learner Engagement levels. Convolutional neural networks
(CNNs) were trained to detect facial action units, which were mapped
onto two psychological measurements, valence (emotional state) and
arousal (attentiveness), using support vector regressions. These valence
and arousal values were combined in a novel manner resulting in Learner
Engagement levels. Moreover, a new approach was used to combine CNNs with
geometric feature-based techniques to improve the performance of the
models. Two experiments were conducted, which found that 9 out of 10
CNN models achieved 95% accuracy on average across the majority of
subjects, whilst the Learner Engagement detector successfully identified
facial expressions that translated to Learner Engagement levels. These
results suggest that this approach is promising: feedback on students'
Learner Engagement can be provided to instructors. Additional research
should be undertaken to further validate these results and to overcome
the limitations that were faced.
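The mapping the abstract describes, from detected facial action unit (AU) intensities to valence and arousal via support vector regression, can be sketched as below. This is a minimal illustration only: the AU count, RBF kernel, synthetic training targets, and variable names are assumptions for demonstration, not the paper's actual configuration (the paper's novel valence/arousal combination step is not reproduced here).

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical training data: each row is a vector of AU intensities
# for one video frame (17 AUs chosen arbitrarily for illustration).
rng = np.random.default_rng(0)
au_intensities = rng.uniform(0.0, 5.0, size=(200, 17))

# Toy regression targets standing in for annotated valence/arousal labels.
valence = np.tanh(au_intensities[:, 0] - au_intensities[:, 1])
arousal = np.tanh(au_intensities[:, 2] + au_intensities[:, 3] - 5.0)

# One SVR per dimension, mirroring the abstract's separate regressions
# from AU detections to valence and to arousal.
valence_svr = SVR(kernel="rbf").fit(au_intensities, valence)
arousal_svr = SVR(kernel="rbf").fit(au_intensities, arousal)

# Predict both dimensions for a new frame of AU intensities.
frame = rng.uniform(0.0, 5.0, size=(1, 17))
pred_valence = float(valence_svr.predict(frame)[0])
pred_arousal = float(arousal_svr.predict(frame)[0])
print(pred_valence, pred_arousal)
```

In practice, the AU intensities would come from the trained CNN detectors rather than random data, and the two predictions would then be combined into a Learner Engagement level by the paper's method.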