Document Type: Research Article

Authors

1 Department of Computer Engineering, Salman Farsi University of Kazerun, Kazerun, Iran

2 Department of Computer Engineering, Mamasani Higher Education Center, Mamasani, Iran

Abstract

This research examines the most informative signals and presents an effective approach for identifying emotional experiences and mental states from EEG recordings. First, PCA is used to reduce the data's dimensionality from roughly 2K and 1K features down to 10 and 15 while improving performance. Then, because high-quality training data for building EEG-based recognition methods is scarce, a multi-generator conditional GAN is presented that uses several generators to produce high-quality artificial samples covering a more complete distribution of the real data. Finally, a new hybrid LSTM-SVM model is introduced to perform classification. The proposed hybrid network attained an overall accuracy of 99.43% in EEG emotion-state classification and an accuracy of 99.27% in identifying mental states. The approach thus combines two prominent goals of machine learning, high accuracy and a small feature size, and shows strong potential for future classification tasks.
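
The abstract outlines a three-stage pipeline: PCA reduction, GAN-based data augmentation, and hybrid LSTM-SVM classification. The sketch below is a minimal, hypothetical illustration of how such a pipeline could be wired together in Python. The placeholder data, layer sizes, sequence shaping, untrained LSTM, and the omission of the multi-generator conditional GAN step are all assumptions made for illustration, not the authors' configuration.

```python
# Minimal sketch of a PCA -> LSTM feature extractor -> SVM pipeline.
# Assumptions: synthetic placeholder data stands in for EEG features,
# the PCA-reduced vector is fed to the LSTM as a length-10 sequence of
# scalars, and the LSTM is left untrained purely to show the data flow.
import numpy as np
import torch
import torch.nn as nn
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 1000)).astype(np.float32)  # placeholder EEG feature vectors
y = rng.integers(0, 3, size=200)                          # placeholder emotion labels

# Step 1: PCA reduces each 1000-dimensional sample to 10 components.
X_red = PCA(n_components=10).fit_transform(X).astype(np.float32)

# Step 2: LSTM feature extractor; the last hidden state serves as the feature vector.
class LSTMFeatures(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)

    def forward(self, x):          # x: (batch, 10)
        seq = x.unsqueeze(-1)      # (batch, 10, 1): one scalar per time step
        _, (h, _) = self.lstm(seq)
        return h[-1]               # (batch, hidden)

extractor = LSTMFeatures()
with torch.no_grad():
    feats = extractor(torch.from_numpy(X_red)).numpy()

# Step 3: SVM classifies the LSTM features.
clf = SVC(kernel="rbf").fit(feats, y)
print("training accuracy:", clf.score(feats, y))
```

In a full system the LSTM would be trained (or fine-tuned jointly with a classification head) and the GAN-generated samples would be mixed into the training set before this stage; the sketch only conveys how the three components hand data to one another.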

Keywords
