Associating Facial Expressions and Upper-Body Gestures with Learning Tasks for Enhancing Intelligent Tutoring Systems



Behera, Ardhendu, Matthew, Peter, Keidel, Alexander, Vangorp, Peter, Fang, Hui and Canning, Susan ORCID: 0000-0002-3589-5671
(2020) Associating Facial Expressions and Upper-Body Gestures with Learning Tasks for Enhancing Intelligent Tutoring Systems. International Journal of Artificial Intelligence in Education.

IJIED_final.pdf (Author Accepted Manuscript, 6MB). Access to this file is restricted: awaiting official publication and publisher embargo.

Abstract

Learning involves a substantial range of cognitive, social and emotional states. Recognizing and understanding these states in the context of learning is therefore key to designing informed interventions and addressing the needs of the individual student to provide personalized education. In this paper, we explore the automatic detection of learners' nonverbal behaviors, including hand-over-face gestures, head and eye movements, and emotions expressed via facial expressions, during learning. The proposed computer vision-based behavior monitoring method uses a low-cost webcam and can easily be integrated with modern tutoring technologies. We investigate these behaviors in depth over time during a 40-minute classroom session involving reading and problem-solving exercises. The exercises in the session cover three difficulty levels (easy, medium and difficult) within the context of undergraduate computer science. We found a significant increase in head and eye movements both as time progresses and as the difficulty level increases. We demonstrate a considerable occurrence of hand-over-face gestures (on average 21.35%) during the 40-minute session, a behavior that remains unexplored in the education domain. We propose a novel deep learning approach for the automatic detection of hand-over-face gestures in images, achieving a classification accuracy of 86.87%. Hand-over-face gestures increase prominently as the difficulty level of the given exercise increases, and they occur more frequently during problem-solving exercises (easy 23.79%, medium 19.84% and difficult 30.46%) than during reading (easy 16.20%, medium 20.06% and difficult 20.18%).
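
The abstract reports a deep learning classifier for hand-over-face gestures but does not describe its architecture or training setup. The sketch below is purely illustrative, assuming a common transfer-learning approach: a pretrained ResNet-18 fine-tuned as a binary hand-over-face vs. no-occlusion classifier on webcam frames. The dataset path, class layout and hyperparameters are hypothetical and are not taken from the paper.

# Illustrative sketch only: fine-tuning a pretrained ResNet-18 as a binary
# hand-over-face classifier on webcam frame crops. Directory layout, labels
# and hyperparameters are assumptions, not the authors' published method.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Standard ImageNet preprocessing expected by the pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical dataset folder with two subfolders of webcam frames:
# frames/train/hand_over_face/ and frames/train/no_occlusion/
train_set = datasets.ImageFolder("frames/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Replace the final fully connected layer for 2-way classification.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(10):
    running_loss = 0.0
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        running_loss += loss.item()
    print(f"epoch {epoch}: loss {running_loss / len(train_loader):.4f}")

At inference time, the same preprocessing would be applied to each incoming webcam frame and the per-frame predictions aggregated over a session to estimate the proportion of time a hand-over-face gesture is present, which is how statistics such as the reported 21.35% occurrence could be computed.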

Item Type: Article
Uncontrolled Keywords: Adaptive and intelligent multimedia and hypermedia systems, Intelligent Tutoring Systems (ITS), Computer-supported collaborative learning, Neural models applied to AIED systems, Nonverbal gestures, Hand-over-face gestures
Depositing User: Symplectic Admin
Date Deposited: 30 Jan 2020 14:34
Last Modified: 19 Jan 2023 00:05
URI: https://livrepository.liverpool.ac.uk/id/eprint/3072698