Two-person interaction behavior recognition based on joint data

CLC Number: TP391.41

Abstract:

In recent years, significant progress has been made in two-person interaction recognition based on RGB video, but RGB video data still suffer from many problems that seriously affect the recognition rate of two-person interactions. With the rapid development of depth sensors such as the Microsoft Kinect, it has become possible to directly obtain joint data that track human movement, compensating for the shortcomings of RGB video data. Therefore, a two-person interaction behavior recognition method based on joint data is proposed. First, HOJ3D features and joint-distance features are computed from the joint data, converted into images, and fed into separate convolutional neural networks. The two feature vectors extracted by the networks are then concatenated, and a Softmax classifier is used for classification and recognition. Test results on the SBU Kinect Interaction dataset show that the recognition accuracy of the method is improved to a certain extent, reaching 94.4%. The method is simple to implement, capable of real-time processing, and has good application prospects.
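The sketch below is not the authors' code; it is a minimal illustration of the pipeline the abstract describes: joint-distance features are computed from two skeletons and arranged as an image, each feature image is fed to its own small convolutional branch, the branch outputs are concatenated, and a Softmax layer produces the interaction class probabilities. The layer sizes, the 128-dimensional branch outputs, the 15-joint layout, the 8 interaction classes, and the use of a placeholder for the HOJ3D feature image are all assumptions.

```python
import torch
import torch.nn as nn


def joint_distance_features(skel_a: torch.Tensor, skel_b: torch.Tensor) -> torch.Tensor:
    """Pairwise Euclidean distances between the joints of two persons.

    skel_a, skel_b: (T, J, 3) tensors of 3D joint positions over T frames.
    Returns a (T, J, J) stack of distance "images", one J x J map per frame.
    """
    diff = skel_a.unsqueeze(2) - skel_b.unsqueeze(1)   # (T, J, J, 3)
    return diff.norm(dim=-1)                           # (T, J, J)


class StreamCNN(nn.Module):
    """One convolutional branch; the HOJ3D branch is assumed to have the same shape."""

    def __init__(self, out_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.fc = nn.Linear(32 * 4 * 4, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:   # x: (N, 1, H, W)
        return self.fc(self.features(x).flatten(1))


class TwoStreamFusion(nn.Module):
    """Concatenate the two branch features and classify with a Softmax layer."""

    def __init__(self, num_classes: int = 8):              # 8 classes assumed for SBU interactions
        super().__init__()
        self.hoj3d_branch = StreamCNN()
        self.dist_branch = StreamCNN()
        self.classifier = nn.Linear(2 * 128, num_classes)

    def forward(self, hoj3d_img: torch.Tensor, dist_img: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.hoj3d_branch(hoj3d_img), self.dist_branch(dist_img)], dim=1)
        return self.classifier(fused).softmax(dim=1)        # class probabilities


if __name__ == "__main__":
    T, J = 40, 15                                           # frames, joints per person (SBU provides 15 joints)
    skel_a, skel_b = torch.rand(T, J, 3), torch.rand(T, J, 3)
    dist_img = joint_distance_features(skel_a, skel_b).mean(0, keepdim=True).unsqueeze(0)  # (1, 1, J, J)
    hoj3d_img = torch.rand(1, 1, J, J)                      # placeholder for the HOJ3D feature image
    probs = TwoStreamFusion()(hoj3d_img, dist_img)
    print(probs.shape)                                      # torch.Size([1, 8])
```

Averaging the per-frame distance maps into a single image is only one possible way to form the feature image; the paper's own image construction may differ.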

History
  • Online: November 20, 2023