Identification of Food Packaging Activity Using MoCap Sensor Data

  • Conference paper
  • First Online:
Sensor- and Video-Based Activity and Behavior Computing

Part of the book series: Smart Innovation, Systems and Technologies (SIST, volume 291)
Abstract

Automation systems have brought revolutionary changes to our lives, and food packaging activity recognition can add a new dimension to industrial automation. However, identifying packaging activities from upper-body skeleton data alone is challenging because of similarities between activities and subject-dependent variation. The Bento Packaging Activity Recognition Challenge 2021 provides a dataset of ten different activities performed during Bento box packaging in a laboratory, recorded with MoCap (motion capture) sensors; a Bento box is a single-serving packed meal that is very popular in Japanese cuisine. In this paper, we develop methods based on a classical machine learning approach, since the given dataset is small compared with other skeleton datasets. After preprocessing, we extract various hand-crafted features, train extremely randomized trees, random forest, and XGBoost classifiers, and select the best model by cross-validation score. We then explore different combinations of features and use the best combination for prediction. With this methodology, we achieve 64% accuracy in tenfold cross-validation and 53.66% average accuracy in leave-one-subject-out cross-validation.
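The abstract describes a model-selection procedure: compare several tree-based classifiers on hand-crafted features using tenfold cross-validation, then evaluate the chosen model subject-independently with leave-one-subject-out cross-validation. The following is a minimal sketch of that procedure, not the authors' implementation; the array names X, y, and subjects, and all hyperparameter values, are assumptions for illustration.

```python
# Sketch of model selection via cross-validation, as outlined in the abstract.
# Assumes X: (n_windows, n_features) hand-crafted features from MoCap skeleton
# windows, y: integer-encoded activity labels, subjects: subject ID per window.
# These names and settings are hypothetical, not taken from the paper.
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, LeaveOneGroupOut, cross_val_score
from xgboost import XGBClassifier


def select_best_model(X, y, subjects):
    candidates = {
        "extra_trees": ExtraTreesClassifier(n_estimators=300, random_state=0),
        "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
        "xgboost": XGBClassifier(n_estimators=300, eval_metric="mlogloss"),
    }

    # Tenfold cross-validation score is used to pick the best classifier.
    tenfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    scores = {name: cross_val_score(model, X, y, cv=tenfold).mean()
              for name, model in candidates.items()}
    best_name = max(scores, key=scores.get)

    # Subject-independent evaluation: leave-one-subject-out cross-validation,
    # grouping windows by the subject who performed them.
    loso_score = cross_val_score(candidates[best_name], X, y,
                                 groups=subjects, cv=LeaveOneGroupOut()).mean()
    return best_name, scores[best_name], loso_score
```

The same skeleton generalizes to the feature-combination search mentioned in the abstract: repeat the tenfold comparison over candidate feature subsets and keep the subset with the highest score before the final leave-one-subject-out evaluation.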



Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Md Atiqur Rahman Ahad.

Editor information

Editors and Affiliations

Appendix

See Table 3.

Table 3 The summary of the resources used


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Anwar, A., Islam Tapotee, M., Saha, P., Ahad, M.A.R. (2022). Identification of Food Packaging Activity Using MoCap Sensor Data. In: Ahad, M.A.R., Inoue, S., Roggen, D., Fujinami, K. (eds) Sensor- and Video-Based Activity and Behavior Computing. Smart Innovation, Systems and Technologies, vol 291. Springer, Singapore. https://doi.org/10.1007/978-981-19-0361-8_11
