Deep convolutional and gated recurrent neural networks for sensor-based activity recognition
Authors: Wang Zhenyu, Zhang Lei
CLC number: TP212

Fund projects: Supported by the National Natural Science Foundation of China (61203237) and the Natural Science Foundation of Jiangsu Province (BK20191371)




    Abstract:

    Traditional machine learning methods for wearable sensor-based human activity recognition tasks usually require manual feature extraction, so deep neural networks that automatically extract features from human activity data are becoming a new research focus. At present, DeepConvLSTM, which combines a convolutional neural network (CNN) with a long short-term memory (LSTM) recurrent neural network, achieves better recognition accuracy than other recognition methods. To address the difficulty of training neural networks with long short-term memory recurrent units, this paper proposes a fusion model based on a convolutional neural network and gated recurrent units (GRU), and compares its performance with a convolutional neural network and DeepConvLSTM on three public datasets (the ACT, UCI and OPPORTUNITY datasets). The experimental results show that the recognition accuracy of the model on all three datasets is higher than that of the convolutional neural network and close to that of DeepConvLSTM, while the model converges faster than DeepConvLSTM.
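    The GRU's training advantage over the LSTM, which motivates the model above, comes from its simpler gating: it keeps only an update gate z_t and a reset gate r_t, merges the cell state into the hidden state, and so needs three weight matrices per layer instead of the LSTM's four. The standard GRU update (Cho et al., 2014; not spelled out in the abstract itself), for input x_t and previous hidden state h_{t-1}, is:

    \begin{aligned}
    z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z) \\
    r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r) \\
    \tilde{h}_t &= \tanh\bigl(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h\bigr) \\
    h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t
    \end{aligned}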
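    The abstract does not give the layer configuration of the fusion model. As a rough sketch of the CNN + GRU idea only, the PyTorch model below runs a 1D convolutional feature extractor over a raw sensor window and feeds the resulting feature sequence to a GRU; all layer sizes, the 9-channel input and the 6 activity classes are illustrative assumptions (loosely inspired by the UCI HAR data layout), not the paper's settings.

    import torch
    import torch.nn as nn

    class ConvGRU(nn.Module):
        """Sketch of a CNN + GRU fusion model for sensor-based activity
        recognition. Layer sizes, 9 input channels and 6 classes are
        illustrative assumptions, not the architecture from the paper."""

        def __init__(self, in_channels=9, num_classes=6):
            super().__init__()
            # 1D convolutions extract local features along the time axis
            self.conv = nn.Sequential(
                nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.Conv1d(64, 64, kernel_size=5, padding=2),
                nn.ReLU(),
            )
            # GRU layers model the temporal structure of the conv features
            self.gru = nn.GRU(input_size=64, hidden_size=128,
                              num_layers=2, batch_first=True)
            self.fc = nn.Linear(128, num_classes)

        def forward(self, x):                 # x: (batch, channels, time)
            feats = self.conv(x)              # (batch, 64, time)
            feats = feats.permute(0, 2, 1)    # (batch, time, 64) for the GRU
            out, _ = self.gru(feats)          # (batch, time, 128)
            return self.fc(out[:, -1])        # classify from last time step

    model = ConvGRU()
    logits = model(torch.randn(32, 9, 128))   # 32 windows, 128 samples each
    print(logits.shape)                       # torch.Size([32, 6])

    Classifying from the last GRU output is one common choice; mean-pooling the outputs over all time steps would be an equally plausible alternative.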

Cite this article:

Wang Zhenyu, Zhang Lei. Deep convolutional and gated recurrent neural networks for sensor-based activity recognition[J]. Journal of Electronic Measurement and Instrumentation, 2020, 34(1): 1-9.

History
  • Online publication date: 2023-06-15
  • Publication date: 2020-01-31