A generative adversarial network-based method for generating annotated data sets of automatic loading and unloading objects

Affiliation: 1. School of Electromechanical Engineering, Guangdong University of Technology, Guangzhou 510006, China; 2. Foshan Cangke Intelligent Technology Co., Ltd., Foshan 528225, China

CLC Number: TP391.4

Abstract:

To address the time-consuming process of creating deep learning annotation data for unmanned lifting target detection, a generative adversarial network for cargo image detection was designed to construct an accurate data set containing semantic annotations and key-point annotations, which can be used to train supervised deep learning semantic segmentation models. The generative adversarial networks StyleGAN and DatasetGAN were fused to mitigate the semantic feature deformation that arises in practical applications. The per-sample normalization layer of the generator was modified to remove the mean operation, and the input modes of the noise module and the style control factor were adjusted. To address the weak spatial-position encoding of objects with a single texture feature, the constant input of the generator was replaced with Fourier features, and a module integrating nonlinear up- and down-sampling was proposed. Finally, WGAN-GP was introduced to improve the objective function. Using DeepLab-V3 as the evaluation network and DatasetGAN as the baseline, the mIoU of DeepLab-V3 increases by 14.83% on average in the semantic label generation task; in the key-point label generation task, the L2 loss decreases by 0.4×10⁻⁴ on average and the PCK value increases by 5.06% on average. These results verify the feasibility and superiority of the improved generative adversarial network for generating semantic and key-point annotation data.
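
As a rough illustration of the normalization change described above (removing the mean operation from the generator's per-sample normalization), the sketch below shows one way such a layer could look in PyTorch. The class name, the framework choice, and the use of the per-channel second moment are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn


class MeanFreeInstanceNorm(nn.Module):
    """Per-sample normalization without the mean-subtraction step.

    Ordinary instance normalization subtracts the per-channel mean and
    divides by the standard deviation; here only a scale normalization
    is kept, dividing each channel by the root of its second moment
    over the spatial dimensions.
    """

    def __init__(self, eps: float = 1e-8):
        super().__init__()
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width)
        second_moment = x.pow(2).mean(dim=(2, 3), keepdim=True)
        return x * torch.rsqrt(second_moment + self.eps)


# Minimal usage example on a random feature map.
if __name__ == "__main__":
    norm = MeanFreeInstanceNorm()
    features = torch.randn(4, 64, 32, 32)
    print(norm(features).shape)  # torch.Size([4, 64, 32, 32])
```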
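
The abstract also states that WGAN-GP is introduced to improve the objective function. The snippet below is the conventional WGAN-GP gradient penalty term (Gulrajani et al.) written as a generic PyTorch sketch; the critic architecture and the penalty weight used in the paper are not given in the abstract, so treat this as the standard formulation rather than the authors' exact implementation.

```python
import torch


def gradient_penalty(critic, real, fake):
    """Standard WGAN-GP penalty: push the critic's gradient norm toward 1
    on points interpolated between real and generated samples."""
    batch_size = real.size(0)
    alpha = torch.rand(batch_size, 1, 1, 1, device=real.device)
    interpolated = (alpha * real + (1 - alpha) * fake).requires_grad_(True)

    scores = critic(interpolated)
    grads = torch.autograd.grad(
        outputs=scores,
        inputs=interpolated,
        grad_outputs=torch.ones_like(scores),
        create_graph=True,
    )[0]

    grads = grads.reshape(batch_size, -1)
    return ((grads.norm(2, dim=1) - 1.0) ** 2).mean()


# Critic loss with the penalty; lambda = 10 is the value from the original
# WGAN-GP paper and is an assumption here, not a value from this article.
# loss_d = critic(fake).mean() - critic(real).mean() \
#          + 10.0 * gradient_penalty(critic, real, fake)
```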

History
  • Online: April 2, 2024