EEG Emotion Recognition Based on Multi-Scale Self-Attention Convolutional Networks

Authors

H. Chao, F. Yuan

DOI:

https://doi.org/10.4108/eetel.3722

Keywords:

Multi-Channel EEG Signal, Emotional Recognition, Multi-Scale Convolutional Network, Self-Attention Network

Abstract

A multi-view self-attention module is proposed and paired with a multi-scale convolutional model to build a multi-view self-attention convolutional network for multi-channel EEG emotion recognition. First, time-domain and frequency-domain features are extracted from the multi-channel EEG signals, and a three-dimensional feature matrix is built using the spatial mapping relationships among the electrodes. Then, a multi-scale convolutional network extracts high-level abstract features from the feature matrix, and a multi-view self-attention network reinforces these features. Finally, a multilayer perceptron performs emotion classification. Experimental results on the public DEAP emotion dataset show that the multi-view self-attention convolutional network effectively integrates the time-domain, frequency-domain, and spatial-domain features of EEG signals. The multi-view self-attention module suppresses redundant information, applies attention weights that accelerate network convergence, and improves the model's recognition accuracy.
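To make the described pipeline concrete, the following is a minimal PyTorch sketch of this kind of architecture: parallel convolution branches at several kernel sizes (multi-scale), a dot-product self-attention block over the resulting feature maps, and a multilayer-perceptron classification head. The layer sizes, the 9x9 electrode grid, the five band-power input maps, and all names here are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class MultiScaleConvBlock(nn.Module):
    """Parallel convolution branches with different kernel sizes,
    concatenated along the channel dimension (the multi-scale idea)."""
    def __init__(self, in_ch, out_ch, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, k, padding=k // 2),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            for k in kernel_sizes
        ])

    def forward(self, x):
        return torch.cat([branch(x) for branch in self.branches], dim=1)

class SelfAttention2d(nn.Module):
    """Dot-product self-attention over the spatial positions of a feature map."""
    def __init__(self, channels):
        super().__init__()
        self.q = nn.Conv2d(channels, channels // 8, 1)
        self.k = nn.Conv2d(channels, channels // 8, 1)
        self.v = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.q(x).flatten(2).transpose(1, 2)    # (B, HW, C//8)
        k = self.k(x).flatten(2)                    # (B, C//8, HW)
        attn = torch.softmax(q @ k, dim=-1)         # (B, HW, HW) attention map
        v = self.v(x).flatten(2)                    # (B, C, HW)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                 # residual connection

class MSAttnNet(nn.Module):
    def __init__(self, in_ch=5, n_classes=2):       # e.g. 5 band-power maps
        super().__init__()
        self.conv = MultiScaleConvBlock(in_ch, 16)  # 3 branches -> 48 channels
        self.attn = SelfAttention2d(48)
        self.head = nn.Sequential(                  # MLP classification head
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(48 * 16, 128), nn.ReLU(), nn.Linear(128, n_classes),
        )

    def forward(self, x):                           # x: (B, in_ch, 9, 9) grid
        return self.head(self.attn(self.conv(x)))

# Smoke test on a hypothetical 9x9 electrode grid with 5 band-power maps.
logits = MSAttnNet()(torch.randn(8, 5, 9, 9))
print(logits.shape)  # torch.Size([8, 2])

The residual weight gamma is initialized to zero so the attention path contributes nothing at the start of training and is blended in gradually, a common stabilizing choice for self-attention over convolutional features.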

References

Houssein E. H., Hammad A. and Ali A. A., Human emotion recognition from EEG-based brain-computer interface using machine learning: a comprehensive review [J]. Neural Computing and Applications, 2022, 34(15): 12527-12557.

Li X., Zhang Y. Z. and Tiwari P. et al., EEG Based Emotion Recognition: A Tutorial and Review [J]. ACM Computing Surveys, 2023, 55(4).

Takahashi K., Remarks on emotion recognition from multi-modal bio-potential signals [C]. 2004 IEEE International Conference on Industrial Technology (ICIT), IEEE, 2004.

Riedl M., Müller A. and Wessel N., Practical considerations of permutation entropy [J]. European Physical Journal Special Topics, 2013, 222(2): 249-262.

Namazi H., Aghasian E. and Ala T. S. et al., Complexity-based classification of EEG signal in normal subjects and patients with epilepsy [J]. Technology and Health Care, 2019, 28(1): 1-10.

O'Toole J., Discrete quadratic time-frequency distributions: Definition, computation, and a newborn electroencephalogram application [J]. Algorithms, 2013.

Alazrai R., Homoud R. and Alwanni H. et al., EEG-Based Emotion Recognition Using Quadratic Time-Frequency Distribution [J]. Sensors, 2018, 18(8).

Candra H., Yuwono M. and Chai R. et al., Investigation of window size in classification of EEG-emotion signal with wavelet entropy and support vector machine [C]. Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), IEEE, 2015.

Mert A. and Akan A., Emotion recognition from EEG signals by using multivariate empirical mode decomposition [J]. Pattern Analysis and Applications, 2016, 21(1).

Rahman M. M., Sarkar A. K. and Hossain M. A. et al., Recognition of human emotions using EEG signals: A review [J]. Computers in Biology and Medicine, 2021, 136: 104696.

Song T., Zheng W. and Song P. et al., EEG Emotion Recognition Using Dynamical Graph Convolutional Neural Networks [J]. IEEE Transactions on Affective Computing, 2020, 11(3): 532-541.

Wen Z., Xu R. and Du J., A novel convolutional neural networks for emotion recognition based on EEG signal [C]. 2017 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC), IEEE, 2017.

Zhong P., Wang D. and Miao C., EEG-Based Emotion Recognition Using Regularized Graph Neural Networks [J]. IEEE Transactions on Affective Computing, 2022, 13(3): 1290-1301.

Woo S., Park J. and Lee J. Y. et al., Universals and cultural differences in the judgments of facial expressions of emotion [J]. Journal of Personality and Social Psychology, 1987, 53(4): 712.

Plutchik R., The nature of emotions: Human emotions have deep evolutionary roots, a fact that may explain their complexity and provide tools for clinical practice [J]. American Scientist, 2001, 89(4): 344-350.

Koelstra S., Muhl C. and Soleymani M. et al., DEAP: A database for emotion analysis using physiological signals [J]. IEEE Transactions on Affective Computing, 2011, 3(1): 18-31.

Pincus S. M., Approximate entropy as a measure of system complexity [J]. Proceedings of the National Academy of Sciences of the United States of America, 1991, 88(6): 2297-2301.

Ergin T., Ozdemir M. A. and Guren O., Emotion detection using EEG signals based on Multivariate Synchrosqueezing Transform and Deep Learning [C]. Medical Technologies Congress (TIPTEKNO), IEEE, 2021.

Zhang Y., Liu H. and Zhang D. et al., EEG-based emotion recognition with emotion localization via hierarchical self-attention [J]. IEEE Transactions on Affective Computing, 2022.

Islam R., Islam M. and Rahman M. M. et al., EEG channel correlation based model for emotion recognition [J]. Computers in Biology and Medicine, 2021, 136: 104757.

Topic A. and Russo M., Emotion recognition based on EEG feature maps through deep learning network [J]. Engineering Science and Technology, an International Journal, 2021, 24(6): 1442-1454.

Li Q., Liu Y. and Liu C. et al., EEG signal processing and emotion recognition using Convolutional Neural Network [C]. 2021 International Conference on Electronic Information Engineering and Computer Science (EIECS), IEEE, 2021.

Jiménez-Guarneros M. and Alejo-Eleuterio R., A Class-Incremental Learning Method Based on Preserving the Learned Feature Space for EEG-Based Emotion Recognition [J]. Mathematics, 2022, 10(4): 598.

Gao Y., Fu X. and Ouyang T. et al., A Class-Incremental Learning Method Based on Preserving the Learned Feature Space for EEG-Based Emotion Recognition [J]. Advanced Engineering Informatics, 2022, 29: 1574-8.

Published

06-09-2023

How to Cite

[1]
H. Chao and F. Yuan, “EEG Emotion Recognition Based on Multi-Scale Self-Attention Convolutional Networks”, EAI Endorsed Trans e-Learn, vol. 8, no. 4, p. e4, Sep. 2023.