
Expressive motions recognition and analysis with learning and statistical methods

Published in: Multimedia Tools and Applications

Abstract

This paper proposes to recognize and analyze expressive gestures using a descriptive motion language, the Laban Movement Analysis (LMA) method. We extract body features based on LMA factors, which describe both quantitative and qualitative aspects of human movement. For this study, a dataset of 5 gestures performed with 4 emotions was created using the Xsens motion capture system. We used two different approaches for emotion analysis and recognition: the first is based on a machine learning method, the Random Decision Forest (RDF); the second is based on human perception. We derive the most important features for each expressed emotion using both methods, the RDF and human ratings. In the discussion section, we compare the results obtained from the automatic learning method against human perception.
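The pipeline described above, training a Random Decision Forest on LMA-derived motion descriptors and then ranking feature importance per emotion, can be sketched as follows. This is not the authors' code: the feature names are hypothetical LMA-inspired placeholders, the data is synthetic, and scikit-learn's `RandomForestClassifier` stands in for the paper's RDF.

```python
# Hedged sketch (assumptions: synthetic data, placeholder feature names):
# classify emotions from LMA-style motion features with a random forest,
# then rank features by importance, mirroring the paper's two analyses.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical descriptors inspired by LMA factors (Body, Effort, Shape, Space).
feature_names = ["speed", "acceleration", "jerk", "convex_hull_volume",
                 "torso_inclination", "hands_head_distance"]
emotions = ["neutral", "happy", "angry", "sad"]

# Synthetic dataset: 200 gesture samples, one feature vector per sample.
X = rng.normal(size=(200, len(feature_names)))
y = rng.integers(0, len(emotions), size=200)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Forest-derived feature importance; the paper compares such a ranking
# against the importance ranking derived from human ratings.
ranking = sorted(zip(feature_names, clf.feature_importances_),
                 key=lambda t: -t[1])
for name, importance in ranking:
    print(f"{name}: {importance:.3f}")
```

On real data, each row of `X` would hold the LMA features extracted from one captured gesture, and `y` the emotion label assigned during recording.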





Acknowledgements

We would like to thank the staff of the University of Evry Val d’Essonne for participating in our dataset. We would also like to thank Mrs. Alice Jourlin for her help with data gathering and tabulation. This work was partially supported by the Strategic Research Initiatives project iCODE, accredited by University Paris Saclay.

Funding

This study was funded by the Strategic Research Initiatives project iCODE, University Paris Saclay.

Author information


Corresponding author

Correspondence to Insaf Ajili.

Ethics declarations

Conflict of interest

Author Insaf Ajili declares that she has no conflict of interest. Author Zahra Ramezanpanah declares that she has no conflict of interest. Author Malik Mallem declares that he has no conflict of interest. Author Jean Yves Didier declares that he has no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Ajili, I., Ramezanpanah, Z., Mallem, M. et al. Expressive motions recognition and analysis with learning and statistical methods. Multimed Tools Appl 78, 16575–16600 (2019). https://doi.org/10.1007/s11042-018-6893-5

