Target recognition and location of a steel bar binding robot based on deep learning
Affiliation:

School of Electrical and Information Engineering, Beijing University of Civil Engineering and Architecture, Beijing 100044, China

CLC Number:

TP391.4

Abstract:

To address the low recognition accuracy and poor positioning precision of steel bar binding robots, a target recognition and positioning method based on deep learning is proposed. First, the YOLOv4 algorithm is used to detect and crop the target frames of binding points, completing their initial positioning. Second, a contour corner selection method is designed to compute the image coordinates of each binding point from its corner points. Then, the feature extraction part of the Monodepth algorithm is improved by integrating the CBAM attention mechanism, and a path aggregation network structure is introduced into the decoder, strengthening the model's feature extraction ability and further improving stereo matching accuracy. Finally, the depth of each binding point is obtained by binocular stereo vision, and a coordinate transformation solves the mapping between the hand (robot) and eye (camera) coordinate systems of the steel bar binding robot, enabling accurate identification and positioning of binding points. Experimental results show that the method achieves a recognition accuracy of 99.75% for binding-point target frames at a detection speed of 54.65 frames per second, with a maximum spatial positioning error of 11.6 mm. The method reliably identifies and locates binding points and provides strong support for automated binding work.
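The attention mechanism mentioned in the abstract can be illustrated with a short sketch. The PyTorch module below is a standard CBAM block (channel attention followed by spatial attention, as in Woo et al., 2018); the reduction ratio, kernel size, and the point at which it would be inserted into the Monodepth encoder are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """CBAM channel branch: squeeze spatial dims, reweight each channel."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))  # global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))   # global max pooling
        return x * torch.sigmoid(avg + mx).view(b, c, 1, 1)

class SpatialAttention(nn.Module):
    """CBAM spatial branch: squeeze channels, reweight each location."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class CBAM(nn.Module):
    """Channel attention followed by spatial attention; inserted after an
    encoder stage, it reweights features without changing their shape."""
    def __init__(self, channels):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))
```

The final localization step combines binocular triangulation with a rigid hand-eye transform. The Python sketch below shows this geometry under stated assumptions: the Shi-Tomasi corner detector stands in for the paper's contour corner selection method, and `fx`, `fy`, `cx`, `cy`, `baseline`, `R`, and `t` are placeholder calibration parameters.

```python
import numpy as np
import cv2

def binding_point_pixel(crop_gray):
    """Estimate the binding-point pixel inside a cropped YOLOv4 detection.
    Shi-Tomasi corners stand in for the paper's contour corner selection;
    the bar crossing is approximated by the corner centroid."""
    corners = cv2.goodFeaturesToTrack(crop_gray, maxCorners=4,
                                      qualityLevel=0.01, minDistance=5)
    if corners is None:
        return None
    return corners.reshape(-1, 2).mean(axis=0)  # (u, v) within the crop

def pixel_to_camera(u, v, disparity, fx, fy, cx, cy, baseline):
    """Binocular triangulation: Z = f * B / d, then back-project (u, v)."""
    z = fx * baseline / disparity
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

def camera_to_robot(p_cam, R, t):
    """Rigid hand-eye transform from the camera frame to the robot frame."""
    return R @ p_cam + t
```

In a full pipeline, the disparity at each binding point would come from the improved Monodepth stereo matching described above, and `R` and `t` from a hand-eye calibration of the specific robot.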

History
  • Online: April 25, 2024