Laser & Optoelectronics Progress, Vol. 60, Issue 10, 1010012 (2023)

Skeleton Action Recognition Based on Dense Residual Shift Graph Convolutional Network

Tao Yang1,2,*, Jun Han1,2, and Haiyan Jiang1,2
Author Affiliations
  • 1College of Communication and Information Engineering, Shanghai University, Shanghai 200444, China
  • 2Shanghai Institute of Advanced Communication and Data Science, Shanghai 200444, China

    To address the unsatisfactory recognition of similar actions in human skeleton-based action recognition, which stems from insufficient extraction of spatio-temporal features as well as heavy network computation and low computational efficiency, a skeleton action recognition algorithm based on a dense residual shift graph convolutional network is proposed. A pose estimation algorithm is used to extract human skeleton information; the joint, bone, and motion information of the skeleton is computed from the coordinate vectors and fed into the network as separate streams. A dense residual structure is introduced between the shift graph convolution modules to improve the network's performance and efficiency in extracting spatio-temporal features. The proposed algorithm can be applied to daily actions such as walking, sitting down, standing up, undressing, dressing, throwing, and falling. The recognition accuracy is 81.7% on a self-built dataset and reaches 88.1% and 95.3% under the two evaluation criteria of the NTU60 RGB+D dataset, respectively, demonstrating that the algorithm achieves excellent recognition accuracy.
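
    As a rough illustration of the pipeline described in the abstract, the sketch below shows one plausible way to derive the three skeleton input streams (joint coordinates, bone vectors as the difference between each joint and its parent, and motion as the frame-to-frame difference) and to stack blocks with dense residual connections. It is a minimal sketch, not the authors' implementation: the `PARENTS` table, `build_streams`, `DenseResidualStack`, and `toy_block` names are hypothetical, the parent indices are illustrative, the toy block merely stands in for a shift graph convolution module, and "dense residual" is read here as each block receiving the aggregated outputs of all earlier blocks.

```python
import torch
import torch.nn as nn

# Hypothetical parent table for a 25-joint skeleton (NTU RGB+D-style layout);
# the indices are illustrative, not taken from the paper.
PARENTS = [1, 20, 20, 2, 20, 4, 5, 6, 20, 8, 9, 10,
           0, 12, 13, 14, 0, 16, 17, 18, 1, 7, 7, 11, 11]

def build_streams(joints: torch.Tensor):
    """Derive the three input streams from raw joint coordinates.

    joints: (N, C, T, V) -- batch, coordinate channels, frames, joints.
    Returns joint, bone, and motion tensors of the same shape.
    """
    bone = joints - joints[:, :, :, PARENTS]                 # vector from parent to joint
    motion = torch.zeros_like(joints)
    motion[:, :, 1:] = joints[:, :, 1:] - joints[:, :, :-1]  # frame-to-frame difference
    return joints, bone, motion

class DenseResidualStack(nn.Module):
    """Stack of spatio-temporal blocks with dense residual connections:
    each block receives the sum of all earlier outputs (one simple reading
    of 'dense residual'; the paper's exact wiring may differ)."""
    def __init__(self, block_factory, num_blocks: int, channels: int):
        super().__init__()
        self.blocks = nn.ModuleList(block_factory(channels) for _ in range(num_blocks))

    def forward(self, x):
        features = [x]
        for block in self.blocks:
            out = block(sum(features))   # dense residual: aggregate all earlier features
            features.append(out)
        return features[-1]

# Placeholder standing in for a shift graph convolution module.
def toy_block(channels):
    return nn.Sequential(nn.Conv2d(channels, channels, kernel_size=1), nn.ReLU())

if __name__ == "__main__":
    x = torch.randn(2, 3, 64, 25)                        # 2 clips, xyz, 64 frames, 25 joints
    joint, bone, motion = build_streams(x)
    net = DenseResidualStack(toy_block, num_blocks=4, channels=3)
    print(net(joint).shape)                              # torch.Size([2, 3, 64, 25])
```

    In a multi-stream setup of this kind, the joint, bone, and motion streams are typically processed by separate stacks and their classification scores fused, which is consistent with the abstract's statement that the three kinds of information are input into the network respectively.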

    Citation: Tao Yang, Jun Han, Haiyan Jiang. Skeleton Action Recognition Based on Dense Residual Shift Graph Convolutional Network[J]. Laser & Optoelectronics Progress, 2023, 60(10): 1010012

    Paper Information

    Category: Image Processing

    Received: Jan. 4, 2022

    Accepted: Feb. 25, 2022

    Published Online: May 17, 2023

    The Author Email: Yang Tao (983785320@qq.com)

    DOI: 10.3788/LOP220428
