Gait recognition method combining residual network and multi-level block structure

Authors: Zhang Hongying, Tian Penghua

CLC number: TP18

Fund projects: National Key Research and Development Program of China (2018YFB1601200); Open Fund of the Tianjin Key Laboratory of Intelligent Signal and Image Processing, Civil Aviation University of China (2019ASPTJ06)



Abstract:

In gait recognition, occlusion caused by clothing and backpacks prevents the extraction of discriminative gait features and thus lowers recognition accuracy. To address this problem, a gait recognition method combining a residual network with a multi-level block structure is proposed. First, the gait energy image is partitioned horizontally into blocks at multiple scales to extract fine-grained features from different regions and to reduce the influence of local occlusion on the remaining regions; in addition, an Inception module is added at the leg region to better learn the features of the region with the highest motion frequency during walking. Second, to improve the recognition accuracy of the network model, cross-entropy loss, triplet loss, and L2 regularization are combined to constrain the weights of the residual network. Finally, experiments on the public gait datasets CASIA-B and OU-ISIR Treadmill B achieve recognition rates of 87.5% and 82.6% under backpack-carrying and clothing-variation conditions, respectively, indicating that the model is robust to clothing and backpack conditions.
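The horizontal multi-level partitioning can be pictured as splitting a backbone feature map into horizontal strips at several granularities and pooling each strip into its own feature vector. The PyTorch sketch below is a minimal illustration only; the partition levels (1, 2, 4), the feature-map shape, and the function name are assumptions, and the Inception module attached to the leg region in the paper is omitted.

```python
import torch
import torch.nn.functional as F

def multi_level_horizontal_parts(feat, levels=(1, 2, 4)):
    """Split a feature map (N, C, H, W) into horizontal strips at several
    granularities and pool each strip into a single feature vector."""
    parts = []
    for n in levels:
        for strip in feat.chunk(n, dim=2):            # n strips along the height axis
            pooled = F.adaptive_avg_pool2d(strip, 1)   # (N, C, 1, 1)
            parts.append(pooled.flatten(1))            # (N, C)
    return parts  # 1 + 2 + 4 = 7 part vectors per sample with the default levels

# Example with a hypothetical residual-network feature map (batch 8, 256 channels, 16x11)
feat = torch.randn(8, 256, 16, 11)
part_features = multi_level_horizontal_parts(feat)
print(len(part_features), part_features[0].shape)     # 7 torch.Size([8, 256])
```

Because clothing or a backpack typically corrupts only some strips, per-strip features confine the damage to those strips instead of degrading the whole descriptor.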


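The joint objective described in the abstract can also be sketched briefly. In the snippet below, the triplet margin, the loss weighting, and the use of optimizer weight decay to realize the L2 penalty are illustrative assumptions rather than the paper's reported settings.

```python
import torch
import torch.nn as nn

ce = nn.CrossEntropyLoss()                    # identity classification term
triplet = nn.TripletMarginLoss(margin=0.2)    # metric-learning term (margin assumed)

def joint_loss(logits, labels, anchor, positive, negative, w_tri=1.0):
    # Cross-entropy on identity logits plus a weighted triplet loss on embeddings.
    return ce(logits, labels) + w_tri * triplet(anchor, positive, negative)

# The L2 regularization on the residual network's weights is commonly applied
# through the optimizer's weight decay, e.g.:
# optimizer = torch.optim.SGD(model.parameters(), lr=0.01,
#                             momentum=0.9, weight_decay=5e-4)
```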
Cite this article

Zhang Hongying, Tian Penghua. Gait recognition method combining residual network and multi-level block structure[J]. Journal of Electronic Measurement and Instrumentation, 2022, 36(6): 66-72.

History
  • Online publication date: 2023-03-06