Dynamic Semantic-Based Spatial Graph Convolution Network for Skeleton-Based Human Action Recognition



Xie, Jianyang, Meng, Yanda, Zhao, Yitian, Nguyen, Anh ORCID: 0000-0002-1449-211X, Yang, Xiaoyun and Zheng, Yalin
(2024) Dynamic Semantic-Based Spatial Graph Convolution Network for Skeleton-Based Human Action Recognition. In: Proceedings of the AAAI Conference on Artificial Intelligence, 38(6).

AAAI_7708 (3).pdf - Author Accepted Manuscript
Available under License Creative Commons Attribution.

Abstract

Graph convolutional networks (GCNs) have attracted great attention and achieved remarkable performance in skeleton-based action recognition. However, most previous works refine the skeleton topology without considering the types of the different joints and edges, making them unable to represent semantic information. In this paper, we propose a dynamic semantic-based graph convolution network (DS-GCN) for skeleton-based human action recognition, in which joint and edge types are encoded implicitly in the skeleton topology. Specifically, two semantic modules are proposed: a joint type-aware adaptive topology and an edge type-aware adaptive topology. Combining the proposed semantic modules with temporal convolution yields a powerful framework, named DS-GCN, for skeleton-based action recognition. Extensive experiments on two datasets, NTU-RGB+D and Kinetics-400, show that the proposed semantic modules are general enough to be used in various backbones to boost recognition accuracy, while DS-GCN itself notably outperforms state-of-the-art methods. The code is released at https://github.com/davelailai/DS-GCN
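The abstract's core idea, a skeleton topology that adapts according to the semantic type of each joint, can be illustrated with a short sketch. Below is a minimal, hypothetical PyTorch sketch of a joint type-aware adaptive topology layer; all names (JointTypeAwareGCN, num_joint_types, base_adj, and so on) are illustrative assumptions, not the authors' implementation, which is available at the linked repository.

```python
import torch
import torch.nn as nn


class JointTypeAwareGCN(nn.Module):
    """Minimal sketch of a joint type-aware adaptive topology layer.

    A learnable embedding per joint type modulates a fixed skeleton
    adjacency, so the refined topology depends on what each joint is
    (e.g. hand vs. foot) rather than only on its index. Illustrative
    only; see https://github.com/davelailai/DS-GCN for the real code.
    """

    def __init__(self, in_channels, out_channels, joint_types,
                 num_joint_types, base_adj, embed_dim=16):
        super().__init__()
        # joint_types: LongTensor (V,) mapping each joint to a type id
        self.register_buffer("joint_types", joint_types)
        # base_adj: FloatTensor (V, V), the physical skeleton adjacency
        self.register_buffer("base_adj", base_adj)
        self.type_embed = nn.Embedding(num_joint_types, embed_dim)
        self.theta = nn.Linear(embed_dim, embed_dim, bias=False)
        self.phi = nn.Linear(embed_dim, embed_dim, bias=False)
        self.proj = nn.Linear(in_channels, out_channels)

    def forward(self, x):
        # x: (batch, V, in_channels) joint features for one frame
        e = self.type_embed(self.joint_types)                   # (V, D)
        # Pairwise affinity between joint types gives a learned,
        # type-conditioned refinement of the topology.
        affinity = self.theta(e) @ self.phi(e).transpose(0, 1)  # (V, V)
        adj = self.base_adj + torch.softmax(affinity, dim=-1)   # refined A
        return torch.relu(torch.matmul(adj, self.proj(x)))      # aggregate


# Example: 25 NTU-RGB+D joints grouped into 5 hypothetical type classes
V = 25
layer = JointTypeAwareGCN(
    in_channels=3, out_channels=64,
    joint_types=torch.randint(0, 5, (V,)),
    num_joint_types=5,
    base_adj=torch.eye(V),
)
out = layer(torch.randn(8, V, 3))  # -> (8, 25, 64)
```

In the full DS-GCN framework described in the abstract, such a spatial layer would be interleaved with temporal convolutions over the frame axis, and an analogous edge type-aware module would modulate each edge according to the pair of joint types it connects.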

Item Type: Conference or Workshop Item (Unspecified)
Divisions: Faculty of Science and Engineering > School of Electrical Engineering, Electronics and Computer Science
Depositing User: Symplectic Admin
Date Deposited: 15 Apr 2024 07:44
Last Modified: 15 Apr 2024 07:44
DOI: 10.1609/aaai.v38i6.28440
URI: https://livrepository.liverpool.ac.uk/id/eprint/3180307