Journal of Terahertz Science and Electronic Information Technology, Volume 21, Issue 5, 661 (2023)

Group activity recognition based on attention mechanism and spatio-temporal information

JIANG Xue*, QING Linbo, HUANG Jianglan, and LIU Bo

    Group activity recognition aims to identify the common activity of the individuals in a scene. It depends not only on the state of the group but also on individual spatio-temporal features, which describe the semantic information in space and reflect the dynamic changes of the activity. A group activity recognition method based on an attention mechanism and deep spatio-temporal information is proposed. First, ShuffleAttention is introduced into the two-stream feature extraction network to extract individual appearance and motion information. Second, an improved non-local network is employed to extract deep temporal information. Finally, the individual features are fed into a graph convolutional network to model interactions and spatial information, and the group activity is recognized. The accuracies on the Collective Activity Dataset (CAD) and the Collective Activity Extended Dataset (CAED) reach 93.6% and 97.8%, respectively. On CAD, the proposed method is 1.2% and 2.6% more accurate than the Cohesive Cluster Search (CCS) and Actor Relation Graph (ARG) methods, respectively, indicating that it can effectively extract deep spatio-temporal features and improve the accuracy of group activity recognition.
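    The abstract outlines a three-stage pipeline: attention-refined two-stream features per individual, non-local (self-attention) aggregation over time, and graph convolution over the actors to obtain the group label. The following is a minimal, hypothetical PyTorch sketch of how such a pipeline could be wired together; the module names, dimensions, and the simplified attention, non-local, and GCN blocks are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the pipeline described in the abstract (hypothetical, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ShuffleAttentionBlock(nn.Module):
    """Lightweight channel-attention stand-in for ShuffleAttention."""
    def __init__(self, channels: int):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // 4),
            nn.ReLU(inplace=True),
            nn.Linear(channels // 4, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                      # x: (N, C, H, W) backbone feature maps
        w = self.fc(x.mean(dim=(2, 3)))        # squeeze spatial dims, predict channel weights
        return x * w[:, :, None, None]         # re-weight channels

class NonLocalTemporal(nn.Module):
    """Simplified non-local block applied along the time axis."""
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x):                      # x: (N, T, D) per-person features over T frames
        attn = torch.softmax(
            self.q(x) @ self.k(x).transpose(1, 2) / x.size(-1) ** 0.5, dim=-1)
        return x + attn @ self.v(x)            # residual temporal aggregation

class ActorGCN(nn.Module):
    """One graph-convolution layer over the N actors, followed by group classification."""
    def __init__(self, dim: int, num_classes: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.cls = nn.Linear(dim, num_classes)

    def forward(self, x):                      # x: (N, D) individual features
        sim = torch.softmax(x @ x.t() / x.size(-1) ** 0.5, dim=-1)  # actor relation graph
        h = F.relu(sim @ self.proj(x))                              # message passing
        return self.cls(h.mean(dim=0))                              # pool actors -> group logits

# Toy forward pass: 5 actors x 8 frames of 64-channel 7x7 feature maps, 5 group activities.
maps = torch.randn(5 * 8, 64, 7, 7)
attended = ShuffleAttentionBlock(64)(maps)            # attention-refined appearance/motion maps
feats = attended.mean(dim=(2, 3)).view(5, 8, 64)      # (actors, frames, dim)
feats = NonLocalTemporal(64)(feats).mean(dim=1)       # temporal aggregation -> (5, 64)
logits = ActorGCN(64, 5)(feats)                       # group-activity logits
print(logits.shape)                                   # torch.Size([5])
```

    The toy forward pass at the bottom only illustrates the expected tensor shapes at each stage; the real method additionally uses a two-stream (RGB + optical flow) backbone and is trained end to end on the group-activity labels.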

    Citation
    JIANG Xue, QING Linbo, HUANG Jianglan, LIU Bo. Group activity recognition based on attention mechanism and spatio-temporal information[J]. Journal of Terahertz Science and Electronic Information Technology, 2023, 21(5): 661.

    Paper Information

    Received: Jul. 16, 2022

    Accepted: --

    Published Online: Jan. 17, 2024

    The Author Email: JIANG Xue (1015475773@qq.com)

    DOI: 10.11805/tkyda2022139
