A multi-scale fusion segmentation network method for cloud detection in remote sensing images
CLC number: TN911; TP751

Fund project: Supported by the National Natural Science Foundation of China (61571160)


Cloud detection in remote sensing images with multilevel scale fused network

Abstract:

When high-precision cloud detection is performed on visible-spectrum remote sensing images, the variability of cloud form and the similarity between cloud areas and ground objects reduce detection accuracy. To address this problem, this paper proposes a weighted multi-level scale fused network (WMSFNet), which can be trained end-to-end without manual intervention. First, sensitivity to cloud form is reduced by learning cloud areas and ground objects in turn. Meanwhile, WMSFNet automatically extracts high-level spatial features through a fully convolutional network, so that clouds and ground objects can be distinguished at the pixel level. A multi-level feature fusion structure is designed to combine deep semantic information with shallow spatial detail from different levels, which both helps separate clouds from spectrally similar surface objects and sharpens the detection of segmentation boundaries, improving the accuracy of the estimated cloud fraction. Experiments on several real remote sensing images demonstrate that the proposed method reaches a pixel accuracy of 95.39%, better than other state-of-the-art semantic segmentation methods, with a cloud-fraction error of less than 1%, offering a new solution for cloud-contaminated remote sensing images.

Cite this article:

GUO Yue, YU Ximing, WANG Shaojun, PENG Yu. Cloud detection in remote sensing images with multilevel scale fused network [J]. Chinese Journal of Scientific Instrument, 2019, 40(6): 31-38.
History:
  • Online published: 2022-02-10