An XAI-Based CNN-RNN Model for Emotion Recognition with Multi-Channel Signals

Authors

  • Rishi Kumar Sharma, Vivek Kumar, Rajendra Kumar

Keywords:

Emotion detection, time-distributed convolutions, CNN-RNN, explainable AI (XAI).

Abstract

Human emotion recognition is a challenging task. Humans express emotions through facial gestures, body temperature, and brain activity, and brain activity in particular can be observed via EEG recordings. The DEAP dataset is a rich source of multimodal physiological signal data, including EEG recordings that capture a range of stimulated human emotions. In this paper, a novel CNN-RNN architecture is reported for recognizing human emotions from the DEAP data. The model first applies two-dimensional convolutions in a time-distributed fashion and then uses recurrence to exploit the temporal information. The trained model addresses the synergistic exploitation of temporal and multi-channel information in high-dimensional feature spaces and aims to improve recognition performance. The model's contribution to classification performance is explained via the SHAP explainable AI (XAI) approach. Results indicate improved classification accuracy, and SHAP values from the XAI framework indicate the significance of the architecture in achieving satisfactory performance.
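The pipeline the abstract describes — a 2D convolution applied in a time-distributed fashion to each frame of multi-channel data, followed by a recurrent pass that exploits the temporal dimension — can be sketched in miniature as follows. This is an illustrative NumPy sketch, not the authors' implementation: the 9×9 electrode-grid layout, the 3×3 kernel, and the 8-unit hidden state are assumed dimensions chosen only for demonstration.

```python
import numpy as np

def conv2d_valid(x, k):
    """Naive 'valid' 2D cross-correlation of a frame x with kernel k."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def time_distributed_cnn_rnn(frames, kernel, Wxh, Whh):
    """Apply the same conv to every time step, then run a simple RNN.

    frames: (T, H, W) -- T time steps of a 2D channel map,
            e.g. EEG electrodes arranged on a spatial grid.
    Returns the final hidden state summarizing the sequence.
    """
    T = frames.shape[0]
    h = np.zeros(Whh.shape[0])
    for t in range(T):
        # time-distributed step: identical conv weights at each t, ReLU, flatten
        feat = np.maximum(conv2d_valid(frames[t], kernel), 0).ravel()
        # vanilla recurrent update over the per-frame feature vectors
        h = np.tanh(Wxh @ feat + Whh @ h)
    return h

# Demo with assumed sizes: 4 frames of a 9x9 grid, 3x3 kernel -> 49 features
rng = np.random.default_rng(0)
frames = rng.normal(size=(4, 9, 9))
kernel = rng.normal(size=(3, 3))
Wxh = 0.1 * rng.normal(size=(8, 49))
Whh = 0.1 * rng.normal(size=(8, 8))
h = time_distributed_cnn_rnn(frames, kernel, Wxh, Whh)  # shape (8,)
```

In a full model the final hidden state would feed a dense classification head, and a deep-learning framework's time-distributed wrapper would replace the explicit loop; the sketch only shows how shared convolutional weights per frame combine with recurrence across frames.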


References

K. Oatley, Best laid schemes: The psychology of the emotions. Cambridge University Press, 1992.

P. Thagard, Mind: Introduction to cognitive science. MIT press, 2005.

M. C. Nussbaum, Upheavals of Thought: The Intelligence of Emotions. Cambridge University Press, 2001. doi: 10.1017/CBO9780511840715.

J. L. McGaugh, Emotions and bodily responses: A psychophysiological approach. Academic Press, 2013.

S. Z. Li, A. K. Jain, Y.-L. Tian, T. Kanade, and J. F. Cohn, “Facial expression analysis,” Handbook of face recognition, pp. 247–275, 2005.

K. Sailunaz, M. Dhaliwal, J. Rokne, and R. Alhajj, “Emotion detection from text and speech: a survey,” Soc Netw Anal Min, vol. 8, pp. 1–26, 2018.

Y. Wang et al., “A systematic review on affective computing: Emotion models, databases, and recent advances,” Information Fusion, vol. 83, pp. 19–52, 2022.

M. Egger, M. Ley, and S. Hanke, “Emotion recognition from physiological signal analysis: A review,” Electron Notes Theor Comput Sci, vol. 343, pp. 35–55, 2019.

X. Li, D. Song, P. Zhang, G. Yu, Y. Hou, and B. Hu, “Emotion recognition from multi-channel EEG data through convolutional recurrent neural network,” in 2016 IEEE international conference on bioinformatics and biomedicine (BIBM), 2016, pp. 352–359.

S. Roy, I. Kiral-Kornek, and S. Harrer, “ChronoNet: A deep recurrent neural network for abnormal EEG identification,” in Artificial Intelligence in Medicine: 17th Conference on Artificial Intelligence in Medicine, AIME 2019, Poznan, Poland, June 26–29, 2019, Proceedings 17, 2019, pp. 47–56.

A. Supratak, H. Dong, C. Wu, and Y. Guo, “DeepSleepNet: A model for automatic sleep stage scoring based on raw single-channel EEG,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 25, no. 11, pp. 1998–2008, 2017.

P. Bashivan, I. Rish, M. Yeasin, and N. Codella, “Learning representations from EEG with deep recurrent-convolutional neural networks,” arXiv preprint arXiv:1511.06448, 2015.

J. Thomas, L. Comoretto, J. Jin, J. Dauwels, S. S. Cash, and M. Brandon, “EEG Classification via Convolutional Neural Network-Based Interictal Epileptiform Event Detection,” in Conf Proc IEEE Eng Med Biol Soc., 2019, pp. 1–13. doi: 10.1109/EMBC.2018.8512930.

M. Husken and P. Stagge, “Recurrent neural networks for time series classification,” Neurocomputing, vol. 50, pp. 223–235, 2003.

S. Ali et al., “Explainable Artificial Intelligence (XAI): What we know and what is left to attain Trustworthy Artificial Intelligence,” Information Fusion, vol. 99, Nov. 2023, doi: 10.1016/j.inffus.2023.101805.

H. Taniguchi, T. Takata, M. Takechi, and A. Furukawa, “Explainable Artificial Intelligence Model for Diagnosis of Atrial Fibrillation Using Holter Electrocardiogram Waveforms,” 2021. doi: 10.1536/ihj.21-094.

M. Ganeshkumar, V. Ravi, V. Sowmya, E. A. Gopalakrishnan, and K. P. Soman, “Explainable Deep Learning-Based Approach for Multilabel Classification of Electrocardiogram,” IEEE Trans Eng Manag, vol. 70, no. 8, pp. 2787–2799, 2023, doi: 10.1109/TEM.2021.3104751.

M. S. Islam, I. Hussain, M. M. Rahman, S. J. Park, and M. A. Hossain, “Explainable Artificial Intelligence Model for Stroke Prediction Using EEG Signal,” Sensors, vol. 22, no. 24, Dec. 2022, doi: 10.3390/s22249859.

I. Hussain et al., “An Explainable EEG-Based Human Activity Recognition Model Using Machine-Learning Approach and LIME,” Sensors, vol. 23, no. 17, Sep. 2023, doi: 10.3390/s23177452.

H. Alsuradi, W. Park, and M. Eid, “Explainable Classification of EEG Data for an Active Touch Task Using Shapley Values,” in Human-Computer Interaction, 2020. doi: 10.1007/978-3-030-60117-1.

K. Zhao, D. Xu, K. He, and G. Peng, “Interpretable Emotion Classification Using Multidomain Feature of EEG Signals,” IEEE Sens J, vol. 23, no. 11, pp. 11879–11891, 2023, doi: 10.1109/JSEN.2023.3266322.

J. M. M. Torres, S. Medina-DeVilliers, T. Clarkson, M. D. Lerner, and G. Riccardi, “Evaluation of interpretability for deep learning algorithms in EEG emotion recognition: A case study in autism,” Artif Intell Med, vol. 143, p. 102545, 2023, doi: 10.1016/j.artmed.2023.102545.

C. A. Ellis, D. A. Carbajal, R. L. Miller, V. D. Calhoun, and M. D. Wang, “An Explainable Deep Learning Approach for Multimodal Electrophysiology Classification,” in bioRxiv, IEEE, 2021, pp. 12–15.

S. Koelstra et al., “DEAP: A database for emotion analysis; Using physiological signals,” IEEE Trans Affect Comput, vol. 3, no. 1, pp. 18–31, Jan. 2012, doi: 10.1109/T-AFFC.2011.15.

J. Zhang, Z. Yin, P. Chen, and S. Nichele, “Emotion recognition using multi-modal data and machine learning techniques: A tutorial and review,” Information Fusion, vol. 59, pp. 103–126, Jul. 2020, doi: 10.1016/J.INFFUS.2020.01.011.

V. Doma and M. Pirouz, “A comparative analysis of machine learning methods for emotion recognition using EEG and peripheral physiological signals,” J Big Data, vol. 7, no. 1, Dec. 2020, doi: 10.1186/s40537-020-00289-7.

A. Tripathi and T. Choudhury, “Permuted layer-based CNN for Emotion Detection with Multi-Modality Physiological Signals,” in 2023 IEEE International Conference on Contemporary Computing and Communications (InC4), 2023, pp. 1–5. doi: 10.1109/InC457730.2023.10263176.


Published

12.06.2024

How to Cite

Rishi Kumar Sharma. (2024). An XAI-Based CNN-RNN Model for Emotion Recognition with Multi-Channel Signals. International Journal of Intelligent Systems and Applications in Engineering, 12(4), 4968 –. Retrieved from https://www.ijisae.org/index.php/IJISAE/article/view/7246

Issue

Section

Research Article