Image thresholding method guided by maximizing similarity of multi-directional weighted intuitionistic fuzzy sets
Affiliation: 1. Hubei Key Laboratory of Intelligent Vision Based Monitoring for Hydroelectric Engineering, China Three Gorges University, Yichang 443002, China; 2. College of Computer and Information Technology, China Three Gorges University, Yichang 443002, China

CLC Number: TP391


    Abstract:

    To address the poor segmentation accuracy and limited adaptability of existing thresholding segmentation methods, an image thresholding method guided by maximizing the similarity of multi-directional weighted intuitionistic fuzzy sets is proposed. First, the method applies convolution kernels based on the first-order derivative of an anisotropic Gaussian to perform multi-directional convolution and multi-scale product transformation on the input image, yielding four reference images with unimodal histograms, one for each direction. Then, it constructs the corresponding intuitionistic fuzzy sets by sampling the four reference images with a binary contour image. Finally, it fuses the four intuitionistic fuzzy sets with a multi-directional weighting strategy to build a similarity objective function, and selects the gray level at which this objective function attains its maximum as the segmentation threshold. The proposed method is comprehensively compared with five recent segmentation methods on 8 synthetic images and 88 real-world images. The experimental results show that it achieves higher segmentation accuracy and more flexible adaptability, with average Matthews correlation coefficients of 0.998 on the synthetic images and 0.964 on the real-world ones, outperforming the second-best method by 39.90% and 26.22%, respectively.
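
    The page does not include the paper's formulas, so the sketch below is only a rough Python illustration of the pipeline the abstract describes. The kernel parameters, the intuitionistic fuzzy membership/non-membership construction, the similarity measure, and the equal direction weights are assumptions made for illustration, not the authors' definitions.

```python
# Rough sketch of the described pipeline (assumed details, not the paper's exact formulas).
import numpy as np
from scipy.ndimage import convolve, binary_erosion

def anisotropic_gaussian_derivative(theta, sigma_x=2.0, sigma_y=1.0, size=9):
    """First-order derivative of an anisotropic Gaussian rotated by angle theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)      # rotate coordinates into the kernel frame
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr**2 / (2 * sigma_x**2) + yr**2 / (2 * sigma_y**2)))
    return -(xr / sigma_x**2) * g                   # derivative along the rotated axis

def reference_image(image, theta, scales=(1.0, 2.0)):
    """Multi-scale product of directional convolution responses for one direction."""
    prod = np.ones(image.shape, dtype=float)
    for s in scales:
        k = anisotropic_gaussian_derivative(theta, sigma_x=2.0 * s, sigma_y=1.0 * s)
        prod *= np.abs(convolve(image.astype(float), k, mode='reflect'))
    return prod

def ifs_from_values(values):
    """Toy intuitionistic fuzzy set: per-pixel membership and non-membership degrees."""
    rng = values.max() - values.min()
    mu = (values - values.min()) / (rng + 1e-12)    # membership degree
    nu = (1.0 - mu) * 0.9                           # non-membership, leaving a hesitation margin
    return mu, nu

def ifs_similarity(a, b):
    """Distance-based similarity between two intuitionistic fuzzy sets."""
    (mu_a, nu_a), (mu_b, nu_b) = a, b
    return 1.0 - 0.5 * (np.abs(mu_a - mu_b) + np.abs(nu_a - nu_b)).mean()

def select_threshold(image, weights=(0.25, 0.25, 0.25, 0.25)):
    """Pick the gray level maximizing the multi-directionally weighted similarity."""
    thetas = (0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)         # four directions
    refs = [ifs_from_values(reference_image(image, th)) for th in thetas]
    best_t, best_score = 0, -np.inf
    for t in range(1, 255):
        binary = image >= t
        contour = (binary & ~binary_erosion(binary)).astype(float)   # binary contour image
        ifs_c = ifs_from_values(contour)
        score = sum(w * ifs_similarity(r, ifs_c) for w, r in zip(weights, refs))
        if score > best_score:
            best_t, best_score = t, score
    return best_t
```

    For a grayscale uint8 image `gray`, `select_threshold(gray)` returns the gray level that maximizes this weighted similarity under the assumed definitions.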

History
  • Online: May 15, 2024