Sleep EEG staging based on multi-view and attention mechanism

DOI:
Author:
Affiliation:
Author biography:
Corresponding author:
CLC number: R318
Fund project: Supported by the Young Scientists Fund of the National Natural Science Foundation of China (62101189) and the Natural Science Foundation of Zhejiang Province (LTGC23F010001)

Abstract:

To extract features of sleep EEG more comprehensively, a sleep EEG staging method based on multi-view learning and an attention mechanism is proposed. First, two views, a time-domain view and a time-frequency view, are constructed from the raw sleep EEG signal. Then, a hybrid neural network with an attention mechanism is designed to learn representations from the multi-view data. Next, a bi-directional long short-term memory (BiLSTM) network learns the transition rules between sleep stages. Finally, the Softmax function performs sleep staging, and a class-weighted loss function addresses the class imbalance of the sleep data. In the experiments, single-channel EEG recordings of the first 20 subjects in the Sleep-EDF database are used, and 20-fold cross-validation is adopted to evaluate the model. Sleep staging accuracy reaches 83.7%, the macro-averaged F₁ score (MF₁) reaches 79.0%, and Cohen's Kappa coefficient reaches 0.78. Compared with existing methods, the performance is clearly improved, which demonstrates the effectiveness of the proposed method.
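As a reading aid, the following is a minimal sketch of the two-view construction described in the abstract. The epoch length (30 s), sampling rate (100 Hz, as in the Sleep-EDF Fpz-Cz channel), the choice of an STFT spectrogram as the time-frequency view, and the window parameters are all assumptions for illustration; the paper's exact preprocessing is not given on this page.

```python
# Sketch of building the time-domain and time-frequency views for one epoch.
# Assumptions (not specified here): 30 s epochs, 100 Hz sampling rate, and a
# log-power spectrogram as the time-frequency view.
import numpy as np
from scipy.signal import spectrogram

FS = 100                 # sampling rate (Hz), assumed
EPOCH_SEC = 30           # one sleep epoch = 30 s (standard scoring convention)

def build_views(epoch_1d: np.ndarray):
    """Return (time_view, time_frequency_view) for one 30 s EEG epoch."""
    assert epoch_1d.shape == (FS * EPOCH_SEC,)
    # View 1: the raw time-domain signal, z-scored per epoch.
    time_view = (epoch_1d - epoch_1d.mean()) / (epoch_1d.std() + 1e-8)
    # View 2: a log-power spectrogram; window/overlap sizes are illustrative.
    f, t, sxx = spectrogram(epoch_1d, fs=FS, nperseg=200, noverlap=100)
    tf_view = np.log(sxx + 1e-8)
    return time_view, tf_view

# Example on synthetic data: one 30 s epoch of noise.
rng = np.random.default_rng(0)
tv, tfv = build_views(rng.standard_normal(FS * EPOCH_SEC))
print(tv.shape, tfv.shape)   # (3000,) and (freq_bins, time_frames)
```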

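The pipeline named in the abstract (per-epoch feature encoder with attention, a BiLSTM over the epoch sequence to model stage transitions, a Softmax classifier, and a class-weighted loss) can be sketched in PyTorch as below. For brevity only the time-domain branch of the hybrid network is shown; the time-frequency branch would be analogous. All layer sizes, the additive-attention form, and the inverse-frequency weighting scheme are illustrative assumptions, not the authors' exact architecture.

```python
# Hedged sketch: CNN + attention per epoch, BiLSTM over the epoch sequence,
# Softmax head, and class-weighted cross-entropy for the 5-class imbalance.
import torch
import torch.nn as nn

N_STAGES = 5  # W, N1, N2, N3, REM

class AttentionPool(nn.Module):
    """Additive attention that pools a (B, T, D) sequence into (B, D)."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, 1))
    def forward(self, x):                        # x: (B, T, D)
        w = torch.softmax(self.score(x), dim=1)  # attention weights (B, T, 1)
        return (w * x).sum(dim=1)                # weighted sum -> (B, D)

class EpochEncoder(nn.Module):
    """1-D CNN + attention pooling over the time-domain view of one epoch."""
    def __init__(self, out_dim=128):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=50, stride=6), nn.ReLU(),
            nn.MaxPool1d(8),
            nn.Conv1d(32, out_dim, kernel_size=8), nn.ReLU(),
        )
        self.pool = AttentionPool(out_dim)
    def forward(self, x):                        # x: (B, 1, 3000)
        h = self.cnn(x).transpose(1, 2)          # (B, T, out_dim)
        return self.pool(h)                      # (B, out_dim)

class SleepStager(nn.Module):
    """Encode each epoch, run a BiLSTM over the sequence, classify per epoch."""
    def __init__(self, feat_dim=128, hidden=64):
        super().__init__()
        self.encoder = EpochEncoder(feat_dim)
        self.bilstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, N_STAGES)
    def forward(self, seq):                      # seq: (B, L, 1, 3000), L epochs
        B, L = seq.shape[:2]
        feats = self.encoder(seq.reshape(B * L, 1, -1)).reshape(B, L, -1)
        ctx, _ = self.bilstm(feats)              # (B, L, 2*hidden)
        return self.head(ctx)                    # per-epoch logits (B, L, 5)

def class_weights(train_labels: torch.Tensor) -> torch.Tensor:
    """Weights inversely proportional to per-stage counts (assumed scheme)."""
    counts = torch.bincount(train_labels, minlength=N_STAGES).float()
    return counts.sum() / (N_STAGES * counts.clamp(min=1))

# Usage with random data: 2 sequences of 10 consecutive 30 s epochs.
model = SleepStager()
dummy = torch.randn(2, 10, 1, 3000)
labels = torch.randint(0, N_STAGES, (2, 10))
criterion = nn.CrossEntropyLoss(weight=class_weights(labels.reshape(-1)))
logits = model(dummy)
loss = criterion(logits.reshape(-1, N_STAGES), labels.reshape(-1))
print(logits.shape, float(loss))
```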
Cite this article:

李兰亭,苗敏敏.基于多视图与注意力机制的睡眠脑电分期[J].国外电子测量技术,2024,43(1):30-37

History
  • Received:
  • Revised:
  • Accepted:
  • Online publication date: 2024-05-28
  • Publication date: