Classification of Music-Induced Emotions Based on Information Fusion of Forehead Biosignals and Electrocardiogram

Published in: Cognitive Computation

Abstract

Emotion recognition systems have been developed to assess human emotional states during different experiences. In this paper, an approach is proposed for recognizing music-induced emotions through the fusion of three-channel forehead biosignals (the left temporalis, frontalis, and right temporalis channels) and the electrocardiogram (ECG). Four emotional states in the arousal–valence space (positive valence/low arousal, positive valence/high arousal, negative valence/high arousal, and negative valence/low arousal) were classified by employing two parallel support vector machines (SVMs) as arousal and valence classifiers. The classifier inputs were obtained by applying a fuzzy-rough model feature evaluation criterion together with a sequential forward floating selection (SFFS) algorithm. An average classification accuracy of 88.78% was achieved, corresponding to an average valence classification accuracy of 94.91% and an average arousal classification accuracy of 93.63%. The proposed emotion recognition system may be useful for interactive multimedia applications or music therapy.



Acknowledgments

We gratefully acknowledge Ms. Atena Bajoulvand for her help in collecting the data from the female subjects. The authors would also like to thank the anonymous reviewers for their insightful comments.

Author information


Corresponding author

Correspondence to Mohsen Naji.


About this article

Cite this article

Naji, M., Firoozabadi, M. & Azadfallah, P. Classification of Music-Induced Emotions Based on Information Fusion of Forehead Biosignals and Electrocardiogram. Cogn Comput 6, 241–252 (2014). https://doi.org/10.1007/s12559-013-9239-7
