Motor imagery EEG classification based on MTACNet
Affiliation:

1. College of Electronics and Information Engineering, Sichuan University, Chengdu 610065, China; 2. Department of Electronic and Information Engineering, Chengdu Jincheng College, Chengdu 611731, China

CLC Number: TP391


    Abstract:

    To make better use of the relevant features in EEG signals and improve the classification performance of motor imagery EEG, a multilayer convolutional network (MTACNet) based on mixed features and parallel multi-scale TCN modules was constructed. First, a multilayer convolutional neural network based on mixed features is built, with an efficient channel attention mechanism embedded in it and PReLU selected as the activation function, to extract the temporal and spatial information in the EEG signal. Then the TCN module is improved into a parallel multi-scale time-domain feature extraction module, which is connected to the multilayer convolutional network to further mine feature information at different time scales. Tested on the public dataset BCI_IV_2a and the self-collected dataset SCU_MI_EEG, the network achieves average classification accuracies of 86.15% and 77.10%, with standard deviations of 9.17% and 13.58%, respectively. For the self-collected dataset, a preprocessing method was also designed that fuses multi-frequency-domain EEG signals into a three-channel input; after preprocessing, the average classification accuracy increased by 3.29%. The experimental results show that, compared with other methods, the classification network constructed in this paper achieves relatively good classification results, and the designed preprocessing method can reduce the impact of complex environments and irrelevant interference factors on the classification results.
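As a rough illustration of two building blocks named in the abstract, the following NumPy sketch implements an efficient channel attention step and the dilated causal convolutions that underlie a parallel multi-scale TCN. All function names, kernel sizes, and weights here are illustrative placeholders, not the authors' implementation, which uses learned parameters inside a full network.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def eca_attention(feat, kernel_size=3, weights=None):
    """Efficient channel attention over a (channels, time) feature map.

    Pools each channel over time, runs a 1-D convolution across channels,
    squashes with a sigmoid, and rescales the input channel-wise.
    """
    c, _ = feat.shape
    pooled = feat.mean(axis=1)                      # global average pooling -> (c,)
    if weights is None:                             # placeholder kernel, not learned
        weights = np.full(kernel_size, 1.0 / kernel_size)
    pad = kernel_size // 2
    padded = np.pad(pooled, pad, mode="edge")
    conv = np.array([padded[i:i + kernel_size] @ weights for i in range(c)])
    attn = sigmoid(conv)                            # per-channel weight in (0, 1)
    return feat * attn[:, None]

def dilated_causal_conv(x, weights, dilation):
    """One dilated causal convolution, the core operation of a TCN branch."""
    k = len(weights)
    pad = (k - 1) * dilation                        # left-pad so output stays causal
    padded = np.pad(x, (pad, 0))
    taps = np.arange(k) * dilation
    return np.array([padded[t + taps] @ weights for t in range(len(x))])

def multiscale_tcn(x, weights, dilations=(1, 2, 4)):
    """Parallel branches at several dilations, stacked as separate feature rows."""
    return np.stack([dilated_causal_conv(x, weights, d) for d in dilations])

# toy usage on a single-channel signal of 8 samples
signal = np.ones(8)
branches = multiscale_tcn(signal, weights=np.array([0.5, 0.5]))
attended = eca_attention(branches)
```

Running the branches in parallel at different dilations is what lets the module see several time scales at once; the attention step then reweights those feature rows before classification.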

History
  • Online: January 04, 2024