Feature Separation Graph Convolutional Networks for Skeleton-Based Action Recognition

Date
2024
Publisher
The Eurographics Association
Abstract
Graph Convolutional Networks have made significant advances in skeleton-based action recognition. However, most existing methods process body features globally and overlook partial visual occlusion, which severely impairs recognition when body parts are obscured. To address this issue, we propose Feature Separation Graph Convolutional Networks (FS-GCN), consisting of Feature Separation Modeling (FSM) and Exchange Modeling (EM). FSM strategically separates the skeleton feature into essential body parts, emphasizing upper-body features while seamlessly integrating lower-body features, which allows FS-GCN to better capture the distinctive spatial and temporal characteristics of each body segment. EM swaps body-half correlation matrices between different graph convolution modules, eliminating discrepancies between them and enabling more robust, unified global information processing. Furthermore, FS-GCN divides the adaptive graph into two key parts for graph contrastive learning, extracting more intra-class contrastive information during training. FS-GCN achieves state-of-the-art performance on the NTU RGB+D, NTU RGB+D 120, and NW-UCLA datasets, especially in line-of-sight-obstructed scenarios.
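
To make the abstract's two main components concrete, below is a minimal PyTorch-style sketch of upper/lower-body feature separation (FSM) and correlation-matrix exchange between modules (EM). The joint partition for NTU RGB+D's 25-joint skeletons, the branch layers, and the upper-body weighting alpha are illustrative assumptions based on the abstract, not the authors' released implementation.

import torch
import torch.nn as nn

# Assumed upper/lower joint split for NTU RGB+D 25-joint skeletons
# (0-based Kinect v2 indices; illustrative, not taken from the paper).
UPPER = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 20, 21, 22, 23, 24]
LOWER = [12, 13, 14, 15, 16, 17, 18, 19]

class FeatureSeparation(nn.Module):
    """Sketch of FSM: process upper- and lower-body features with
    separate branches, emphasize the upper body, then re-assemble."""
    def __init__(self, channels, alpha=0.7):
        super().__init__()
        self.upper_branch = nn.Conv2d(channels, channels, kernel_size=1)
        self.lower_branch = nn.Conv2d(channels, channels, kernel_size=1)
        self.alpha = alpha  # assumed emphasis weight on upper-body features

    def forward(self, x):
        # x: (batch, channels, frames, joints)
        upper = self.upper_branch(x[..., UPPER])
        lower = self.lower_branch(x[..., LOWER])
        out = x.clone()
        # Emphasize refined upper-body features, integrate lower-body ones.
        out[..., UPPER] = self.alpha * upper + (1 - self.alpha) * x[..., UPPER]
        out[..., LOWER] = lower
        return out

def exchange_half_correlations(A1, A2, half_idx=UPPER):
    """Sketch of EM: swap one body half's sub-block between the learned
    joint-correlation (adaptive adjacency) matrices of two GCN modules."""
    idx = torch.tensor(half_idx)
    rows, cols = idx.unsqueeze(1), idx.unsqueeze(0)  # (k,1) x (1,k) index grid
    B1, B2 = A1.clone(), A2.clone()
    B1[rows, cols] = A2[rows, cols]
    B2[rows, cols] = A1[rows, cols]
    return B1, B2

if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 25)          # batch of 25-joint sequences
    fsm = FeatureSeparation(channels=64)
    print(fsm(x).shape)                      # torch.Size([2, 64, 32, 25])
    A1, A2 = torch.rand(25, 25), torch.rand(25, 25)
    B1, B2 = exchange_half_correlations(A1, A2)
    assert torch.equal(B1[0, 0], A2[0, 0])   # upper-body block was swapped

The exchange operates on whole sub-blocks of the correlation matrices rather than individual entries, which matches the abstract's description of swapping body-half correlation matrices between graph convolution modules.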
CCS Concepts: Computing methodologies → Activity recognition and understanding; Neural networks

        
Citation
@inproceedings{10.2312:pg.20241317,
  booktitle = {Pacific Graphics Conference Papers and Posters},
  editor    = {Chen, Renjie and Ritschel, Tobias and Whiting, Emily},
  title     = {{Feature Separation Graph Convolutional Networks for Skeleton-Based Action Recognition}},
  author    = {Zhang, Lingyan and Ling, Wanyu and Daizhou, Shuwen and Kuang, Li},
  year      = {2024},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-250-9},
  DOI       = {10.2312/pg.20241317}
}