Research on positioning and detection technology of parallel robot based on deep learning
Affiliation:

1. North University of China, School of Mechanical Engineering, Taiyuan, Shanxi 030051, China; 2. Shanxi Crane Digital Engineering Technology Research Center, Taiyuan, Shanxi 030051, China

CLC Number: TP242.2

    Abstract:

    Aiming at the problems of fuzzy target recognition, poor classification efficiency and slow response speed of parallel robots in the field of machine vision, a positioning and detection technology for parallel robots based on deep learning is proposed. First, a parallel-robot image dataset is constructed to improve image recognition accuracy and object recognition efficiency. Second, the training scheme is improved: pre-training followed by actual training strengthens both the reliability of the model and the loss strategy. Then, the base coordinate system and the camera coordinate system of the parallel robot are established, and the transformation between the actual target coordinates and the robot base coordinate system is obtained by combining hand-eye calibration with camera calibration. Finally, the target calibration results are verified on a parallel-robot experimental platform. The relative errors between network positioning and actual positioning of the parallel mechanism are compared against the mainstream deep learning algorithms YOLOv3, YOLOv4 and Faster R-CNN; the results show that the positioning error of YOLOX is about 3.992-5.061 mm with an average accuracy of about 91%. This method provides a useful reference for the detection and positioning of parallel robots combined with deep learning.
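
    The abstract does not give implementation details for the coordinate-transformation step; the following Python sketch only illustrates the general idea of mapping a detected pixel centre into the robot base frame via camera intrinsics and a hand-eye transform. The intrinsic matrix K, the depth value and the transform T_base_cam below are hypothetical placeholders, not values from the paper.

        # Illustrative sketch: pixel coordinates -> robot base-frame coordinates.
        # K, Z and T_base_cam are made-up example values, not calibration results
        # reported in the paper.
        import numpy as np

        def pixel_to_base(u, v, Z, K, T_base_cam):
            """Back-project pixel (u, v) at depth Z into the camera frame,
            then map the point into the robot base frame."""
            # Pinhole model: p_cam = Z * K^-1 [u, v, 1]^T
            p_cam = Z * np.linalg.inv(K) @ np.array([u, v, 1.0])
            # Homogeneous transform from camera frame to base frame
            # (the kind of result produced by hand-eye calibration)
            p_base = T_base_cam @ np.append(p_cam, 1.0)
            return p_base[:3]

        # Hypothetical calibration values for demonstration only
        K = np.array([[1200.0,    0.0, 640.0],
                      [   0.0, 1200.0, 360.0],
                      [   0.0,    0.0,   1.0]])
        T_base_cam = np.array([[ 0.0, -1.0,  0.0, 0.30],
                               [-1.0,  0.0,  0.0, 0.10],
                               [ 0.0,  0.0, -1.0, 0.80],
                               [ 0.0,  0.0,  0.0, 1.00]])

        print(pixel_to_base(700, 400, 0.75, K, T_base_cam))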

History
  • Online: April 25, 2024