Can Automated Gesture Recognition Support the Study of Child Language Development?



Samanta, S ORCID: 0000-0003-2200-3061, Bannard, C ORCID: 0000-0001-5579-5830 and Pine, J ORCID: 0000-0002-7077-9713
(2020) Can Automated Gesture Recognition Support the Study of Child Language Development? In: 42nd Annual Meeting of the Cognitive Science Society, 2020-07-29 to 2020-08-01.


ss_cogsci_2020_July.pdf - Published version

Abstract

Children's prelinguistic gestures play a central role in their communicative development. Early gesture use has been shown to be predictive of both concurrent and later language ability, making the identification of gestures in video data at scale a potentially valuable tool for both theoretical and clinical purposes. We describe a new dataset consisting of videos of 72 infants interacting with their caregivers at 11 and 12 months, annotated for the appearance of 12 different gesture types. We propose a model based on deep convolutional neural networks to classify these gestures. The model achieves 48.32% classification accuracy overall, but with significant variation between gesture types. Critically, we found strong (0.7 or above) rank order correlations between by-child gesture counts from human and machine coding for 7 of the 12 gestures (including the critical gestures of declarative pointing, hold outs and gives). Given the challenging nature of the data (recordings of many different dyads in different environments engaged in diverse activities), we consider these results a very encouraging first attempt at the task, and evidence that automatic or machine-assisted gesture identification could make a valuable contribution to the study of cognitive development.

Item Type: Conference or Workshop Item (Unspecified)
Depositing User: Symplectic Admin
Date Deposited: 05 Aug 2020 09:02
Last Modified: 18 Jan 2023 23:47
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3092548
