High-resolution human pose estimation network based on feature enhancement
Affiliation: School of Information Engineering, Jiangxi University of Science and Technology, Ganzhou 341000, China

CLC Number: TP391.41

    Abstract:

    To address the insufficient feature extraction of lightweight convolutional neural networks in high-resolution human pose estimation, this paper proposes a high-resolution human pose estimation network based on feature enhancement. First, a dilated convolution completion operation is used to extract image features, avoiding the loss of feature information while keeping the model's parameter count essentially unchanged. Second, a pooling enhancement module selects among the features extracted by convolution, retaining important features and reducing the damage that traditional pooling modules inflict on them. Finally, a depthwise separable convolution module that strengthens channel information interaction is used for feature extraction, keeping the module's parameter count small while improving its feature extraction ability. The proposed algorithm and the DiteHRNet-30 algorithm were evaluated on the COCO2017 dataset, achieving AR values of 77.9% and 77.2%, respectively; on the MPII dataset, their PCKh values are 32.6% and 31.7%, respectively. Experimental results show that the proposed algorithm achieves a good balance between the accuracy of human pose estimation and the complexity of the algorithm.
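    The abstract's third component, depthwise separable convolution with strengthened channel interaction, can be illustrated with a minimal NumPy sketch. This is not the paper's actual module; it assumes a standard decomposition into a per-channel (depthwise) convolution followed by a 1x1 (pointwise) convolution, with a channel shuffle inserted between them as one common way to strengthen cross-channel information exchange. All function names, shapes, and the shuffle step are illustrative assumptions.

```python
import numpy as np

def depthwise_conv(x, dw_kernels):
    """Depthwise 3x3 convolution: each channel is filtered independently.
    x: (C, H, W); dw_kernels: (C, 3, 3). 'Same' padding, stride 1."""
    C, H, W = x.shape
    pad = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros_like(x, dtype=float)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                out[c, i, j] = np.sum(pad[c, i:i + 3, j:j + 3] * dw_kernels[c])
    return out

def pointwise_conv(x, pw_kernels):
    """1x1 convolution mixing information across channels.
    x: (C_in, H, W); pw_kernels: (C_out, C_in) -> (C_out, H, W)."""
    return np.tensordot(pw_kernels, x, axes=([1], [0]))

def channel_shuffle(x, groups):
    """Interleave channels across groups so later layers see features
    from every group (an assumed way to strengthen channel interaction)."""
    C, H, W = x.shape
    return x.reshape(groups, C // groups, H, W).transpose(1, 0, 2, 3).reshape(C, H, W)

# Hypothetical input: 4 channels, 8x8 feature map, expanded to 8 output channels.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 8))
dw = rng.standard_normal((4, 3, 3))   # 4*9  = 36 depthwise weights
pw = rng.standard_normal((8, 4))      # 8*4  = 32 pointwise weights

y = pointwise_conv(channel_shuffle(depthwise_conv(x, dw), groups=2), pw)
print(y.shape)  # (8, 8, 8)
```

    The parameter-count motivation from the abstract is visible here: the depthwise + pointwise pair uses 36 + 32 = 68 weights, whereas a standard 3x3 convolution with 4 input and 8 output channels would use 4 x 8 x 9 = 288.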

History
  • Online: April 30, 2024