Combining an attention mechanism with GhostUNet for pavement crack detection
Affiliation:

1. School of Information Science and Technology, Shijiazhuang Tiedao University, Shijiazhuang 050043, China; 2. State Key Lab of Mechanical Behavior and System Safety of Traffic Engineering Structures, Shijiazhuang Tiedao University, Shijiazhuang 050043, China

CLC Number: TP391
Abstract:

Pavement cracks are among the most common road defects. With the development of deep learning, more and more methods have been applied to extracting crack information from pavement images. To address the low accuracy and poor real-time performance of existing deep-learning-based pavement crack detection methods, which stem from incomplete extraction of crack features, a pavement crack detection method combining an attention mechanism with GhostUNet is proposed. The method consists of an encoder and a decoder. The conventional convolutions in U-Net are replaced with Ghost convolutions, which reduces the number of model parameters. To strengthen crack-feature extraction, the ECA (Efficient Channel Attention) mechanism and residual connections are introduced into both the encoder and the decoder: the ECA module filters out irrelevant feature information, and the residual connections help avoid network degradation. To evaluate the effectiveness of the method, ablation and comparison experiments were conducted on two publicly available crack datasets. Compared with U-Net, F1_score, precision (P), and recall (R) increased by 14.48%, 14.35%, and 14.45%, respectively, and the model is 14.2 MB smaller. Compared with similar models, the proposed model achieves higher segmentation accuracy with fewer parameters.
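The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch (not the authors' code) of how a Ghost-convolution encoder/decoder block with an ECA attention gate and a residual shortcut might be assembled; the class names, ghost ratio, and kernel sizes are illustrative assumptions.

    import torch
    import torch.nn as nn

    class ECA(nn.Module):
        # Efficient Channel Attention: global average pooling, then a 1D
        # convolution across channels, then a sigmoid gate.
        def __init__(self, channels, k_size=3):
            super().__init__()
            self.pool = nn.AdaptiveAvgPool2d(1)
            self.conv = nn.Conv1d(1, 1, kernel_size=k_size,
                                  padding=k_size // 2, bias=False)

        def forward(self, x):
            y = self.pool(x)                                # (B, C, 1, 1)
            y = self.conv(y.squeeze(-1).transpose(-1, -2))  # (B, 1, C)
            y = torch.sigmoid(y).transpose(-1, -2).unsqueeze(-1)
            return x * y                                    # rescale channels

    class GhostConv(nn.Module):
        # Ghost convolution: a primary conv generates part of the output
        # channels; cheap depthwise convs generate the "ghost" remainder.
        def __init__(self, in_ch, out_ch, ratio=2):
            super().__init__()
            primary = out_ch // ratio
            self.primary = nn.Sequential(
                nn.Conv2d(in_ch, primary, 3, padding=1, bias=False),
                nn.BatchNorm2d(primary), nn.ReLU(inplace=True))
            self.cheap = nn.Sequential(
                nn.Conv2d(primary, out_ch - primary, 3, padding=1,
                          groups=primary, bias=False),
                nn.BatchNorm2d(out_ch - primary), nn.ReLU(inplace=True))

        def forward(self, x):
            y = self.primary(x)
            return torch.cat([y, self.cheap(y)], dim=1)

    class GhostECABlock(nn.Module):
        # One encoder/decoder stage: two Ghost convolutions, an ECA gate,
        # and a residual shortcut (1x1 conv when channel counts differ).
        def __init__(self, in_ch, out_ch):
            super().__init__()
            self.body = nn.Sequential(GhostConv(in_ch, out_ch),
                                      GhostConv(out_ch, out_ch),
                                      ECA(out_ch))
            self.shortcut = (nn.Identity() if in_ch == out_ch
                             else nn.Conv2d(in_ch, out_ch, 1, bias=False))

        def forward(self, x):
            return self.body(x) + self.shortcut(x)

A quick shape check: GhostECABlock(3, 64) maps a (1, 3, 256, 256) input to (1, 64, 256, 256), so such blocks can be stacked with pooling and upsampling to form the U-shaped encoder-decoder.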

History
  • Online: March 27, 2024