
Human activity recognition-based path planning for autonomous vehicles

  • Original Paper
  • Published in: Signal, Image and Video Processing

Abstract

Human activity recognition (HAR) is a broad research topic in computer science. Improving HAR could lead to major breakthroughs in humanoid robotics, medical robots, and autonomous vehicles. A system able to recognise a human and their activity without errors or anomalies would lead to safer and more empathetic autonomous systems. In this work, multiple neural network models of varying complexity are investigated. Each model is re-trained on a proposed, unique data set gathered on an automated guided vehicle (AGV) equipped with sensors commonly used on autonomous vehicles. The best model is selected based on its final action-recognition accuracy, and its pipeline is fused with YOLOv3 to enhance human detection. In addition to this pipeline improvement, multiple action direction estimation methods are proposed.
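The abstract mentions action direction estimation but does not specify the methods; one plausible baseline (an assumption here, not necessarily the paper's approach) estimates a person's movement direction from the displacement of their skeleton-keypoint centroid between consecutive frames. A minimal sketch, with hypothetical keypoint inputs in image coordinates:

```python
import math

def estimate_direction(prev_keypoints, curr_keypoints):
    """Estimate movement direction in degrees from the displacement of the
    skeleton centroid between two consecutive frames.

    Each argument is a list of (x, y) keypoints for one person; the angle is
    measured with math.atan2, so 0 degrees points along the +x image axis.
    """
    px = sum(x for x, _ in prev_keypoints) / len(prev_keypoints)
    py = sum(y for _, y in prev_keypoints) / len(prev_keypoints)
    cx = sum(x for x, _ in curr_keypoints) / len(curr_keypoints)
    cy = sum(y for _, y in curr_keypoints) / len(curr_keypoints)
    return math.degrees(math.atan2(cy - py, cx - px))

# A person whose skeleton shifts one unit along +x between frames:
prev = [(0.0, 0.0), (0.0, 2.0)]
curr = [(1.0, 0.0), (1.0, 2.0)]
print(estimate_direction(prev, curr))  # → 0.0
```

In a full pipeline, the keypoints would come from a pose estimator applied inside YOLOv3 person boxes; smoothing the angle over several frames would reduce jitter from noisy detections.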



Author information


Corresponding author

Correspondence to Gholamreza Anbarjafari.


Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This work has been partially supported by the Estonian Centre of Excellence in IT (EXCITE) funded by the European Regional Development Fund. The authors also gratefully acknowledge the support of NVIDIA Corporation with the donation of the Titan X Pascal GPU.


About this article


Cite this article

Tammvee, M., Anbarjafari, G. Human activity recognition-based path planning for autonomous vehicles. SIViP 15, 809–816 (2021). https://doi.org/10.1007/s11760-020-01800-6

