Deep learning gesture recognition based on multi-feature combination in complex background
Affiliation:

Physics & Electronic Information Engineering, Henan Polytechnic University, Jiaozuo 454003, China

CLC Number:

TP391.41

    Abstract:

    With the rapid development of science and technology, human-computer interaction based on deep learning has found widespread application. As an important component of human-computer interaction, gesture recognition holds significant research and application value. However, traditional gesture recognition methods based on skin color detection algorithms have limited effectiveness against complex backgrounds. To address this problem, a novel gesture recognition method based on a convolutional neural network that combines skin color and edge features is proposed. First, an elliptical skin color model and the Otsu thresholding algorithm are used to extract gesture skin color features in the YCrCb color space. Next, an improved Canny edge detection algorithm is applied to obtain the edge features of the gesture skin color images, and an edge-filling method is used to process the edge images. Finally, gesture segmentation images are obtained through logical and morphological operations, and these serve as input to the convolutional neural network for training and recognition. Experimental results demonstrate the effectiveness of the proposed approach, which achieves an average recognition rate of 98.83% on the NUS hand posture dataset II. The proposed method shows a significant improvement over traditional gesture recognition methods and can effectively recognize gestures against complex backgrounds.
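    The skin color segmentation stage described above can be sketched in Python with NumPy. This is a minimal illustration, not the paper's implementation: the ellipse centre and axes in the Cr-Cb plane are illustrative values, and the Otsu threshold is computed on the Cr channel only; the paper's improved Canny step, edge filling, and CNN are omitted.

    ```python
    import numpy as np

    def rgb_to_ycrcb(img):
        # Standard ITU-R BT.601 conversion (assumes 8-bit RGB input)
        r = img[..., 0].astype(float)
        g = img[..., 1].astype(float)
        b = img[..., 2].astype(float)
        y = 0.299 * r + 0.587 * g + 0.114 * b
        cr = (r - y) * 0.713 + 128
        cb = (b - y) * 0.564 + 128
        return np.stack([y, cr, cb], axis=-1)

    def otsu_threshold(channel):
        # Otsu's method: pick the threshold maximizing between-class variance
        hist, _ = np.histogram(channel, bins=256, range=(0, 256))
        total = channel.size
        cum = np.cumsum(hist)
        cum_mean = np.cumsum(hist * np.arange(256))
        best_t, best_var = 0, 0.0
        for t in range(1, 256):
            w0 = cum[t - 1] / total
            w1 = 1.0 - w0
            if w0 == 0 or w1 == 0:
                continue
            mu0 = cum_mean[t - 1] / cum[t - 1]
            mu1 = (cum_mean[-1] - cum_mean[t - 1]) / (total - cum[t - 1])
            var = w0 * w1 * (mu0 - mu1) ** 2
            if var > best_var:
                best_var, best_t = var, t
        return best_t

    def skin_mask(img):
        # Combine an elliptical skin cluster in the Cr-Cb plane with an
        # Otsu threshold on Cr (centre/axes below are illustrative values)
        ycrcb = rgb_to_ycrcb(img)
        cr, cb = ycrcb[..., 1], ycrcb[..., 2]
        ellipse = ((cr - 153) / 25) ** 2 + ((cb - 102) / 20) ** 2 <= 1.0
        return ellipse & (cr > otsu_threshold(cr))

    # Tiny synthetic check: a skin-toned patch on a blue background
    img = np.zeros((8, 8, 3), dtype=np.uint8)
    img[..., 2] = 255                # blue background
    img[2:6, 2:6] = (220, 160, 130)  # skin-like color
    mask = skin_mask(img)            # True on the patch, False elsewhere
    ```

    In the full pipeline this mask would be combined, via logical AND followed by morphological closing, with the filled edge map from the improved Canny detector before being fed to the CNN.
    
    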

History
  • Online: March 21, 2024