Research on EEG emotion recognition based on multi-domain information fusion
Affiliation:

1. School of Electronic and Optical Engineering, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
2. Nation-Local Joint Project Engineering Lab of RF Integration & Micropackage, Nanjing University of Posts and Telecommunications, Nanjing 210023, China

CLC Number: TP391

Abstract:

Existing EEG emotion recognition methods rarely integrate spatial, temporal, and frequency information. To fully exploit the rich information contained in EEG signals, this paper proposes a multi-domain information fusion method for EEG emotion recognition. The method uses a parallel convolutional neural network (PCNN) model that combines a two-dimensional convolutional neural network (2D-CNN) and a one-dimensional convolutional neural network (1D-CNN) to learn the spatial, temporal, and frequency features of EEG signals and classify human emotional states. The 2D-CNN mines spatial and frequency information between neighboring EEG channels, while the 1D-CNN mines temporal and frequency information of the EEG. Finally, the features extracted by the two parallel CNN branches are fused for emotion recognition. Experimental results for three-class emotion recognition on the SEED dataset show that the PCNN fusing spatial, temporal, and frequency features achieves an overall classification accuracy of 98.04%, an improvement of 1.97% and 0.60% over the 2D-CNN extracting only spatial-frequency information and the 1D-CNN extracting only temporal-frequency information, respectively. Compared with recent similar work, the proposed method is superior for EEG emotion classification.
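The abstract describes the architecture only at a high level. The following PyTorch code is a minimal sketch of one plausible realization of such a parallel model, not the authors' implementation: the input shapes (five frequency-band maps on a 9×9 electrode grid for the 2D branch, 62-channel band-filtered time series for the 1D branch, matching the 62-channel SEED montage), all layer widths, and all kernel sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class PCNN(nn.Module):
    """Sketch of a parallel 2D-CNN/1D-CNN for EEG emotion recognition.

    Assumed inputs (illustrative only, not from the paper):
      x2d: (batch, bands, H, W)  band-power maps over the electrode grid
           (spatial + frequency information)
      x1d: (batch, channels, T)  band-filtered EEG time series
           (temporal + frequency information)
    """
    def __init__(self, bands=5, eeg_channels=62, n_classes=3):
        super().__init__()
        # 2D branch: spatial-frequency features from channel maps
        self.branch2d = nn.Sequential(
            nn.Conv2d(bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # -> (batch, 64)
        )
        # 1D branch: temporal-frequency features from time series
        self.branch1d = nn.Sequential(
            nn.Conv1d(eeg_channels, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),   # -> (batch, 64)
        )
        # Fuse the two feature vectors and classify three emotion states
        self.classifier = nn.Sequential(
            nn.Linear(64 + 64, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x2d, x1d):
        fused = torch.cat([self.branch2d(x2d), self.branch1d(x1d)], dim=1)
        return self.classifier(fused)

# Usage with dummy SEED-like tensors: 5 bands on a 9x9 grid,
# and 62 channels x 200 time samples per segment (both assumed).
model = PCNN()
logits = model(torch.randn(8, 5, 9, 9), torch.randn(8, 62, 200))
print(logits.shape)  # torch.Size([8, 3])
```

Concatenating the two branch outputs before a shared classifier is the simplest feature-level fusion consistent with the abstract; the actual fusion strategy may differ in the paper.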

History
  • Online: April 30, 2024