Accurate metaheuristic deep convolutional structure for a robust human gait recognition

Reem Nehad Yousef, Abeer Tawkool Khalil, Ahmed Shaaban Samra, Mohamed Maher Ata

Abstract


Gait recognition has become an emerging technology in various security, industrial, medical, and military applications. This paper proposes a deep convolutional neural network (CNN) model that authenticates humans by their walking style. The proposed model has been applied to two commonly used standardized datasets: the Chinese Academy of Sciences (CASIA) dataset and the Osaka University-Institute of Scientific and Industrial Research (OU-ISIR) dataset. After the silhouette images were isolated from the gait datasets, their features were extracted using the proposed deep CNN as well as traditional architectures, including AlexNet, Inception (GoogLeNet), VGGNet, ResNet50, and Xception. The best features were then selected using the genetic algorithm, grey wolf optimizer (GWO), particle swarm optimizer (PSO), and chi-square algorithms. Finally, the selected features were classified using the proposed deep neural network (DNN). Several performance metrics were estimated to evaluate the model's quality, including accuracy, specificity, sensitivity, false negative rate (FNR), and training time. Experiments demonstrated that the proposed framework with the genetic feature selector outperforms the other selectors and recent related work, achieving accuracy values of 99.46% and 99.09% on the CASIA and OU-ISIR datasets, respectively, with a short training time (19 seconds).
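
The sketch below illustrates the selection-and-classification stage described in the abstract. It treats pre-extracted CNN embeddings as plain feature vectors and wraps a simple genetic algorithm (tournament selection, one-point crossover, bit-flip mutation) around a small multilayer perceptron; the array shapes, GA hyperparameters, and the use of scikit-learn's MLPClassifier as a stand-in for the paper's DNN are illustrative assumptions, not the authors' exact implementation.

```python
# Hypothetical sketch: GA-based feature selection over CNN embeddings,
# followed by a small neural-network classifier. Shapes, hyperparameters,
# and the MLP stand-in for the paper's DNN are assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Placeholder for CNN feature vectors (e.g., taken from a pooling layer
# of AlexNet/ResNet50/Xception applied to gait silhouettes).
X = rng.normal(size=(200, 128))     # 200 silhouettes, 128 features each
y = rng.integers(0, 5, size=200)    # 5 hypothetical subject identities

def fitness(mask):
    """Cross-validated accuracy of an MLP trained on the selected features."""
    if mask.sum() == 0:
        return 0.0
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

def genetic_select(n_features, pop_size=10, generations=5, p_mut=0.05):
    """Simple GA: tournament selection, one-point crossover, bit-flip mutation."""
    pop = rng.integers(0, 2, size=(pop_size, n_features))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        children = []
        for _ in range(pop_size):
            # Tournament selection of two parents.
            a, b = rng.choice(pop_size, 2, replace=False)
            p1 = pop[a] if scores[a] >= scores[b] else pop[b]
            a, b = rng.choice(pop_size, 2, replace=False)
            p2 = pop[a] if scores[a] >= scores[b] else pop[b]
            # One-point crossover.
            cut = rng.integers(1, n_features)
            child = np.concatenate([p1[:cut], p2[cut:]])
            # Bit-flip mutation.
            flip = rng.random(n_features) < p_mut
            children.append(np.where(flip, 1 - child, child))
        pop = np.array(children)
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmax()]

best_mask = genetic_select(X.shape[1])
print("selected features:", int(best_mask.sum()), "of", X.shape[1])
```

Swapping the GA loop for a GWO, PSO, or chi-square selector only changes how `best_mask` is produced; the downstream classifier is trained on the masked feature matrix in the same way.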

Keywords


CASIA gait dataset; convolutional neural network; gait recognition; genetic algorithm; OU-ISIR



DOI: http://doi.org/10.11591/ijece.v13i6.pp7005-7015

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578
