Inter-brain EEG Feature Extraction and Analysis for Continuous Implicit Emotion Tagging during Video Watching

Published in IEEE Transactions on Affective Computing, 2021

Abstract: How to efficiently tag the emotional experience of multimedia content is an important and challenging problem in the field of affective computing. This paper presents an EEG-based real-time emotion tagging approach that extracts inter-brain features from a group of participants watching the same emotional video clips. First, continuous subjective reports on both the arousal and valence dimensions of emotion were obtained using a three-round behavioral rating paradigm. Second, inter-brain features were systematically explored in both the spectral and temporal domains. Finally, regression analyses were performed to evaluate the effectiveness of the inter-brain amplitude and phase features. The inter-brain amplitude feature showed significantly better prediction performance than the inter-brain phase feature, as well as two other conventional features (spectral power and inter-subject correlation). By combining the four types of features, regression values (R²) of 0.61 ± 0.01 for arousal and 0.70 ± 0.01 for valence were obtained, corresponding to prediction errors of 1.01 ± 0.02 and 0.78 ± 0.02 (on 9-point scales), respectively. The contributions of different electrodes and frequency bands were also analyzed. Our results show the promising potential of inter-brain EEG features for real-time emotion tagging applications.
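
The abstract's feature pipeline can be illustrated with a short sketch. The snippet below is not the paper's implementation; it assumes a common reading of the two feature families: "inter-brain amplitude" as the pairwise correlation of band-limited amplitude envelopes across subjects, and "inter-brain phase" as the pairwise phase-locking value (PLV). The function names (`bandpass`, `inter_brain_features`), the alpha-band default, and the 1-second window are hypothetical choices for illustration only.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(data, low, high, fs, order=4):
    """Zero-phase band-pass filter along the last (time) axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)

def inter_brain_features(eeg, fs, band=(8.0, 13.0), win_sec=1.0):
    """Compute windowed inter-brain amplitude and phase features.

    eeg : array of shape (n_subjects, n_channels, n_samples)
    Returns two arrays of shape (n_windows, n_channels):
    amplitude-envelope correlations and phase-locking values,
    each averaged over all subject pairs.
    """
    analytic = hilbert(bandpass(eeg, *band, fs), axis=-1)
    amp = np.abs(analytic)      # instantaneous amplitude envelope
    phase = np.angle(analytic)  # instantaneous phase

    n_subj, n_chan, n_samp = eeg.shape
    win = int(win_sec * fs)
    n_win = n_samp // win
    pairs = [(i, j) for i in range(n_subj) for j in range(i + 1, n_subj)]

    amp_feat = np.zeros((n_win, n_chan))
    plv_feat = np.zeros((n_win, n_chan))
    for w in range(n_win):
        sl = slice(w * win, (w + 1) * win)
        for i, j in pairs:
            # Inter-brain amplitude: correlate the two subjects' envelopes.
            for c in range(n_chan):
                amp_feat[w, c] += np.corrcoef(amp[i, c, sl], amp[j, c, sl])[0, 1]
            # Inter-brain phase: PLV of the between-subject phase difference.
            dphi = phase[i, :, sl] - phase[j, :, sl]
            plv_feat[w] += np.abs(np.exp(1j * dphi).mean(axis=-1))
    return amp_feat / len(pairs), plv_feat / len(pairs)
```

Under these assumptions, each window yields one feature vector per channel, which could then be regressed against the time-resolved arousal and valence ratings (for example with a linear regressor such as scikit-learn's `Ridge`); the abstract does not specify the paper's regression model, so that choice is likewise illustrative.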

Download paper here

More information

Recommended citation: Yue Ding, Xin Hu, Zhenyi Xia, Yong-Jin Liu, and Dan Zhang, "Inter-brain EEG Feature Extraction and Analysis for Continuous Implicit Emotion Tagging during Video Watching," IEEE Transactions on Affective Computing, vol. 12, no. 1, pp. 92-102, January-March 2021.