
Emotion Recognition from Physiological Signals Using Continuous Wavelet Transform and Deep Learning

  • Conference paper
HCI International 2022 - Late Breaking Papers. Multimodality in Advanced Interaction Environments (HCII 2022)

Abstract

In recent years, emotion recognition has received increasing attention because it plays an essential role in human-computer interaction systems. This paper proposes a four-class multimodal approach to emotion recognition from peripheral physiological signals that uniquely combines a Continuous Wavelet Transform (CWT) for feature extraction, an overlapping sliding-window scheme to generate additional data samples, and a Convolutional Neural Network (CNN) for classification. The proposed model processes multiple signal types, such as Galvanic Skin Response (GSR), respiration patterns, and blood volume pressure. The achieved accuracy of 84.2% outperforms state-of-the-art models on four-class classification despite relying only on peripheral signals.
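The following is a minimal sketch of how such a pipeline could look in Python, assuming PyWavelets for the CWT and PyTorch for the CNN. It is not the authors' exact configuration: the 128 Hz sampling rate (DEAP's downsampled rate), 8-second windows with 50% overlap, Morlet wavelet, scale range, and network layout are all illustrative assumptions.

```python
# Hypothetical sketch of the pipeline described in the abstract:
# overlapping sliding windows over a peripheral signal, CWT scalograms
# as 2-D features, and a small CNN classifier.
import numpy as np
import pywt
import torch
import torch.nn as nn

def sliding_windows(signal, win_len, step):
    """Segment a 1-D signal into overlapping windows (step < win_len)."""
    return np.stack([signal[i:i + win_len]
                     for i in range(0, len(signal) - win_len + 1, step)])

def cwt_scalogram(window, scales=np.arange(1, 65), wavelet="morl"):
    """Continuous Wavelet Transform of one window -> (scales, time) image."""
    coeffs, _ = pywt.cwt(window, scales, wavelet)
    return np.abs(coeffs).astype(np.float32)

class EmotionCNN(nn.Module):
    """Minimal 2-D CNN over scalogram 'images' for four-class output."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes)
        )

    def forward(self, x):
        return self.head(self.features(x))

# Example: one GSR channel sampled at 128 Hz, 8-second windows, 50% overlap.
gsr = np.random.randn(128 * 60)                       # placeholder signal
wins = sliding_windows(gsr, 128 * 8, 128 * 4)
scalos = np.stack([cwt_scalogram(w) for w in wins])   # (N, 64, 1024)
x = torch.from_numpy(scalos).unsqueeze(1)             # (N, 1, 64, 1024)
logits = EmotionCNN()(x)                              # (N, 4) class scores
```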



Acknowledgement

The work is supported in part by the ‘MELANIE’ project funded by the European Regional Development Fund (ERDF), project No. FESR1138.

Author information


Corresponding author

Correspondence to Lana Jalal.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Jalal, L., Peer, A. (2022). Emotion Recognition from Physiological Signals Using Continuous Wavelet Transform and Deep Learning. In: Kurosu, M., et al. HCI International 2022 - Late Breaking Papers. Multimodality in Advanced Interaction Environments. HCII 2022. Lecture Notes in Computer Science, vol 13519. Springer, Cham. https://doi.org/10.1007/978-3-031-17618-0_8


  • DOI: https://doi.org/10.1007/978-3-031-17618-0_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-17617-3

  • Online ISBN: 978-3-031-17618-0

  • eBook Packages: Computer Science, Computer Science (R0)
