Journal of Northeastern University(Natural Science) ›› 2021, Vol. 42 ›› Issue (6): 768-774.DOI: 10.12068/j.issn.1005-3026.2021.06.002

• Information & Control •

Skeleton-based Action Recognition Method with Two-Stream Multi-relational GCNs

LIU Fang1,2, QIAO Jian-zhong1, DAI Qin3, SHI Xiang-bin2   

  1. School of Computer Science & Engineering, Northeastern University, Shenyang 110169, China; 2. School of Computer Science, Shenyang Aerospace University, Shenyang 110136, China; 3. College of Information, Shenyang Institute of Engineering, Shenyang 110136, China.
  • Revised: 2020-09-18  Accepted: 2020-09-18  Published: 2021-06-23
  • Contact: QIAO Jian-zhong

Abstract: The interactions among human body parts during motion are diverse, but existing skeleton-based action recognition methods using GCNs (graph convolutional networks) can model only a single type of relationship between joints. Borrowing the idea of knowledge graphs, which describe different relationships between entities, a two-stream multi-relational GCN action recognition method based on joints and body parts was proposed, in which the natural connection relationship, the symmetric relationship, and the global relationship among nodes are modeled. The features of each relationship are propagated synchronously and fused effectively within the network. During the global cooperation of human body parts, the interaction range of each part is limited and depends on the specific action. Therefore, an adaptive top-K global adjacency calculation method based on the Non-local algorithm was proposed, in which the K nodes with the strongest interaction are dynamically selected as the neighbors of each node. Experimental results show that the proposed two-stream multi-relational network achieves good recognition accuracy on the Kinetics and NTU-RGB+D datasets.
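
As a rough illustration of the relations described above, the sketch below (not the authors' code; the joint pairings, tensor shapes, and embedding size are assumptions) builds the three relation-specific adjacency matrices: fixed matrices for the natural-connection and symmetry relations, and a Non-local-style affinity from which only the top-K strongest neighbors of each joint are kept.

```python
# Hedged sketch of the multi-relational adjacencies and the adaptive
# top-K global relation; shapes and joint pairs are illustrative.
import torch
import torch.nn as nn

class TopKGlobalAdjacency(nn.Module):
    """Non-local affinity between joints, keeping only the K strongest
    neighbors of each joint (assumed reading of the adaptive top-K
    global adjacency described in the abstract)."""
    def __init__(self, in_channels, embed_channels=64, k=5):
        super().__init__()
        self.theta = nn.Conv2d(in_channels, embed_channels, kernel_size=1)
        self.phi = nn.Conv2d(in_channels, embed_channels, kernel_size=1)
        self.k = k

    def forward(self, x):
        # x: (N, C, T, V) -- batch, channels, frames, joints
        q = self.theta(x).mean(dim=2)                     # (N, C_e, V), pooled over time
        kmat = self.phi(x).mean(dim=2)                    # (N, C_e, V)
        affinity = torch.einsum('ncv,ncw->nvw', q, kmat)  # (N, V, V) pairwise strength
        # keep the top-K strongest responses per joint, suppress the rest
        _, topk_idx = affinity.topk(self.k, dim=-1)
        mask = torch.zeros_like(affinity).scatter_(-1, topk_idx, 1.0)
        affinity = affinity.masked_fill(mask == 0, float('-inf'))
        return torch.softmax(affinity, dim=-1)            # row-normalized global adjacency

def static_adjacencies(num_joints, bone_pairs, symmetric_pairs):
    """Fixed adjacency matrices for the natural-connection and symmetry
    relations; the joint-pair lists are dataset-specific assumptions."""
    a_nat = torch.eye(num_joints)
    for i, j in bone_pairs:
        a_nat[i, j] = a_nat[j, i] = 1.0
    a_sym = torch.eye(num_joints)
    for i, j in symmetric_pairs:
        a_sym[i, j] = a_sym[j, i] = 1.0
    return a_nat, a_sym
```

In a full model, each relation's adjacency would presumably drive its own graph convolution branch, with the joint-level and part-level streams fused afterwards, as the abstract describes.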

Key words: action recognition; skeleton; GCNs(graph convolutional networks); multi-relation; topK
