Automatic identification of readings of pointer-type meters based on deep learning
Affiliation:

1. School of Quality and Technical Supervision, Hebei University, Baoding 071002, China; 2. School of Cyberspace Security and Computer, Hebei University, Baoding 071002, China; 3. Beijing ConST Instrument Technology Co., Inc., Beijing 100094, China; 4. Baoding Cigarette Factory, Hebei Baisha Tobacco Co., Ltd., Baoding 071000, China

CLC Number: TP216; TP391.4; TH86

Abstract:

Pointer-type meters are widely used in the petrochemical, industrial manufacturing, electric power, and tobacco industries. Because manual inspection is infrequent and some meters are installed in hostile environments, safety risks arise in factory production processes, and the personal safety of inspectors is difficult to guarantee. Building on the surveillance camera systems already deployed on factory production lines, this paper proposes an automatic reading identification method for pointer-type meters based on YOLO V3 object detection and DeepLab V3+ image segmentation. YOLO V3 is used to detect the gauge dial and crop it into a sub-image. According to the characteristics of meter images and practical requirements, the structure of DeepLab V3+ is improved and an erosion (morphological corrosion) step is added, so that the scale lines and pointer in the sub-image are located effectively. The gauge range is extracted from the sub-image by OCR and, combined with the relative positions of the scale lines and pointer, the meter reading is computed. Experiments show that the average relative error of the identified readings is 2.17%, indicating that the proposed method meets application requirements.
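
As an illustration of the final step, the sketch below shows how a reading can be computed by the angle method once the segmentation stage has produced a binary pointer mask and the scale lines have been located. It is a minimal sketch in Python with OpenCV, not the authors' exact implementation: the helper names (pointer_angle, reading), the known dial center, and the mask input are assumptions for illustration; the erosion call mirrors the corrosion treatment described above.

    # Minimal sketch (Python + OpenCV): angle-method reading computation.
    # Assumes a binary pointer mask from segmentation and a known dial center
    # (e.g., the center of the detected dial box); names are illustrative.
    import cv2
    import numpy as np

    def pointer_angle(pointer_mask, center):
        """Estimate the pointer direction in degrees from a binary mask."""
        # Erode the mask to suppress thin segmentation noise, mirroring the
        # erosion (corrosion) step added after DeepLab V3+.
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
        eroded = cv2.erode(pointer_mask, kernel, iterations=1)
        ys, xs = np.nonzero(eroded)
        # Treat the pointer pixel farthest from the dial center as the tip.
        d2 = (xs - center[0]) ** 2 + (ys - center[1]) ** 2
        tip_x, tip_y = xs[d2.argmax()], ys[d2.argmax()]
        # Convert image coordinates (y grows downward) to a math angle.
        return float(np.degrees(np.arctan2(center[1] - tip_y,
                                           tip_x - center[0]))) % 360.0

    def reading(theta, theta_min, theta_max, range_min, range_max):
        """Interpolate the reading between the min and max scale angles.

        theta_min/theta_max come from the located scale lines; range_min/
        range_max come from the OCR-extracted gauge range. The scale is
        assumed to sweep clockwise from theta_min to theta_max.
        """
        frac = ((theta_min - theta) % 360.0) / ((theta_min - theta_max) % 360.0)
        return range_min + frac * (range_max - range_min)

For instance, on a dial whose scale sweeps clockwise from 225° (minimum mark) to -45° (maximum mark) over a 0 to 1.6 MPa range, a pointer angle of 90° gives reading(90.0, 225.0, -45.0, 0.0, 1.6) = 0.8 MPa.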

History
  • Online: February 22, 2024