
Transition activity recognition using fuzzy logic and overlapped sliding window-based convolutional neural networks

Published in The Journal of Supercomputing

Abstract

In this paper, we propose a novel approach for recognizing transition activities (e.g., turning left or right, standing up, and walking down stairs). Unlike simple activities, transition activities change continuously and occur within a short time. To recognize activities with these characteristics, we apply a convolutional neural network (CNN), an architecture widely used for recognizing images, speech, and human activities. To generate input instances for the CNN model, we developed an overlapped sliding window method that can accurately capture transition activities occurring within a short interval. To further increase recognition accuracy, we train separate CNN models for simple activities and transition activities. Finally, we adopt fuzzy logic to handle ambiguous activities. All stages of recognizing the elderly's activities are performed on data collected by the six sensors embedded in a smartphone. Experiments demonstrate that our approach improves the recognition accuracy of transition activities.
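The overlapped sliding window segmentation described above could be sketched minimally as follows. This is an illustrative reconstruction, not the authors' implementation; the function name, parameters, and the 50% overlap value are assumptions for demonstration only.

```python
import numpy as np

def overlapped_windows(signal, window_size, overlap):
    """Split a sensor sample stream into overlapping fixed-size windows.

    `overlap` is the fraction of samples shared by consecutive windows
    (e.g., 0.5 means each window starts halfway into the previous one).
    Returns an array of shape (num_windows, window_size).
    """
    step = max(1, int(window_size * (1 - overlap)))
    windows = [signal[start:start + window_size]
               for start in range(0, len(signal) - window_size + 1, step)]
    return np.stack(windows) if windows else np.empty((0, window_size))

# Example: 10 samples, window of 4, 50% overlap -> step of 2,
# yielding windows starting at samples 0, 2, 4, and 6.
ws = overlapped_windows(np.arange(10), window_size=4, overlap=0.5)
```

Because consecutive windows share samples, a short-lived transition (e.g., standing up) that straddles a window boundary is still fully contained in at least one window, which is the motivation for using overlap rather than disjoint segments.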




Acknowledgements

This research was partially supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2016 R1D1A1B03932110) and partially supported by the IT R&D program of KEIT (No. 1005-0810, Development of Disability Independent Accessibility Enhancement Technology for Input and Abnormality of Home Appliances).

Author information

Corresponding author

Correspondence to Mye Sohn.


About this article

Cite this article

Kang, J., Kim, J., Lee, S. et al. Transition activity recognition using fuzzy logic and overlapped sliding window-based convolutional neural networks. J Supercomput 76, 8003–8020 (2020). https://doi.org/10.1007/s11227-018-2470-y
