Abstract
Emotion recognition systems have been developed to assess human emotional states during different experiences. In this paper, an approach is proposed for recognizing music-induced emotions through the fusion of three-channel forehead biosignals (the left temporalis, frontalis, and right temporalis channels) and an electrocardiogram. Four emotional states in an arousal–valence space (positive valence/low arousal, positive valence/high arousal, negative valence/high arousal, and negative valence/low arousal) were classified using two parallel support vector machines as arousal and valence classifiers. The inputs of the classifiers were obtained by applying a fuzzy-rough model feature evaluation criterion and the sequential forward floating selection algorithm. An average classification accuracy of 88.78 % was achieved, corresponding to an average valence classification accuracy of 94.91 % and an average arousal classification accuracy of 93.63 %. The proposed emotion recognition system may be useful for interactive multimedia applications or music therapy.
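The quadrant scheme described above can be sketched in a few lines: two independent binary decisions (valence positive/negative, arousal high/low) are fused into one of the four emotion classes. This is a minimal illustration of the fusion step only; the boolean inputs are hypothetical stand-ins for the outputs of the two trained support vector machines, whose features and kernels are not specified here.

```python
def classify_emotion(valence_positive: bool, arousal_high: bool) -> str:
    """Fuse the two binary classifier outputs into one of the four
    arousal-valence quadrants named in the abstract."""
    if valence_positive and arousal_high:
        return "positive valence / high arousal"
    if valence_positive:
        return "positive valence / low arousal"
    if arousal_high:
        return "negative valence / high arousal"
    return "negative valence / low arousal"


# Example: a sample the (hypothetical) valence SVM labels positive
# and the arousal SVM labels high maps to the first quadrant.
print(classify_emotion(valence_positive=True, arousal_high=True))
```

Because the two classifiers run in parallel, per-axis accuracies (94.91 % valence, 93.63 % arousal) can each exceed the joint four-class accuracy (88.78 %): a sample is counted correct overall only when both binary decisions are correct.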
Acknowledgments
We gratefully thank Ms. Atena Bajoulvand for her help with collecting the data of the female subjects. The authors would also like to thank the anonymous reviewers for their insightful comments.
Cite this article
Naji, M., Firoozabadi, M. & Azadfallah, P. Classification of Music-Induced Emotions Based on Information Fusion of Forehead Biosignals and Electrocardiogram. Cogn Comput 6, 241–252 (2014). https://doi.org/10.1007/s12559-013-9239-7