Affective analysis of patients in homecare video-assisted telemedicine using computational intelligence

  • S.I.: Emerging applications of Deep Learning and Spiking ANN
  • Published in Neural Computing and Applications

Abstract

The affective/emotional status of patients is strongly connected to their healing process and overall health. Being aware of a patient's psychological peaks and troughs therefore allows timely intervention by specialists or close relatives. In this context, this paper presents the design and implementation of an emotion analysis module integrated into an existing telemedicine platform. Two different methodologies are utilized and discussed. The first exploits the speed and consistency of the speeded-up robust features (SURF) algorithm to identify seven different emotions in human faces. The second is based on convolutional neural networks. The whole functionality is provided as a Web service to the healthcare platform during regular video teleconference sessions between authorized medical personnel and patients. The paper discusses the technical details of the implementation and integration of the proposed scheme and presents initial results on its accuracy and operation in practice.
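
To make the second, CNN-based methodology more concrete, the following is a minimal sketch and not the authors' actual network: it assumes a Keras/TensorFlow backend, 48×48 grayscale face crops, and the seven basic emotion labels; the layer sizes, label names and input resolution are illustrative assumptions.

```python
# Illustrative sketch only: a small Keras CNN that classifies 48x48 grayscale
# face crops into seven emotion categories. Architecture and labels are
# assumptions, not the model described in the paper.
import numpy as np
from tensorflow.keras import layers, models

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def build_emotion_cnn(input_shape=(48, 48, 1), num_classes=len(EMOTIONS)):
    """Build a compact convolutional classifier for facial emotion recognition."""
    model = models.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_emotion_cnn()
    # Dummy crop standing in for a face region extracted from a video frame.
    face = np.random.rand(1, 48, 48, 1).astype("float32")
    probs = model.predict(face, verbose=0)[0]
    print(dict(zip(EMOTIONS, probs.round(3))))
```

In a deployment of the kind described, such a classifier would sit behind the platform's Web service and score face crops extracted from the teleconference video stream; the __main__ block above simply prints per-class probabilities for a random dummy crop of an untrained model.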

Acknowledgements

This research has been co‐financed by the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call RESEARCH—CREATE—INNOVATE (Project Code: T1EDK-01046).

Author information

Corresponding author

Correspondence to I. Maglogiannis.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Kallipolitis, A., Galliakis, M., Menychtas, A. et al. Affective analysis of patients in homecare video-assisted telemedicine using computational intelligence. Neural Comput & Applic 32, 17125–17136 (2020). https://doi.org/10.1007/s00521-020-05203-z
