Abstract
Feature Selection (FS) is an important data management technique that aims to minimize redundant information in a dataset. This work proposes DENGO, an improved version of the Northern Goshawk Optimization (NGO), to address the FS problem. NGO is an efficient swarm-based algorithm inspired by the predatory behavior of the northern goshawk. To overcome NGO's tendency to fall into local optima, its slow convergence speed, and its low convergence accuracy, two strategies are introduced into the original NGO. First, a learning strategy is proposed in which search members learn from the information differences between other members of the population, enhancing the algorithm's global search ability while improving population diversity. Second, a hybrid differential strategy is proposed that perturbs individuals to help the algorithm escape local optima and to improve convergence accuracy and speed. To demonstrate the effectiveness of the proposed DENGO, it is compared against eleven advanced algorithms on the CEC2015 and CEC2017 benchmark functions; the results show that DENGO has stronger global exploration capability together with higher convergence performance and stability. The proposed DENGO is then applied to FS, and experiments on 29 benchmark datasets from the UCI repository show that the DENGO-based FS method achieves higher classification accuracy and stability than eight other popular FS methods. DENGO is therefore considered one of the more promising FS techniques. DENGO's code can be obtained at https://www.mathworks.com/matlabcentral/fileexchange/158811-project1.
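The two strategies described above can be illustrated with a minimal sketch. The code below is a hypothetical formulation written for this summary, not the paper's exact update equations: each member learns from the difference ("information gap") between two randomly chosen peers, and a candidate move is kept only if it improves that member's fitness, which is one common way such differential learning is combined with greedy selection.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Benchmark objective: sum of squares, minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def differential_learning_step(pop, fitness, lb, ub):
    """One population update: each member learns from the difference
    between two distinct, randomly chosen peers (hypothetical sketch)."""
    n, dim = pop.shape
    new_pop = pop.copy()
    for i in range(n):
        # Pick two peers different from member i.
        j, k = rng.choice([m for m in range(n) if m != i],
                          size=2, replace=False)
        r = rng.random(dim)
        # Move toward the fitter peer along the peer difference vector.
        if fitness[j] < fitness[k]:
            candidate = pop[i] + r * (pop[j] - pop[k])
        else:
            candidate = pop[i] + r * (pop[k] - pop[j])
        candidate = np.clip(candidate, lb, ub)
        # Greedy selection: accept the move only if it improves member i.
        if sphere(candidate) < fitness[i]:
            new_pop[i] = candidate
    return new_pop

lb, ub = -5.0, 5.0
pop = rng.uniform(lb, ub, size=(20, 10))
initial_best = min(sphere(x) for x in pop)
for _ in range(200):
    fitness = np.array([sphere(x) for x in pop])
    pop = differential_learning_step(pop, fitness, lb, ub)
best = min(sphere(x) for x in pop)
print(best, "<", initial_best)
```

Because the greedy selection never accepts a worsening move, the best fitness is non-increasing across iterations; the peer-difference step supplies the diversity that drives it downward. The paper's second (hybrid differential perturbation) strategy would add a further perturbation term to the same loop.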
Data Availability
The data that support the findings of this study are available from the corresponding author.
Funding
This work is supported in part by the National Natural Science Foundation of China (top-level program) under Grant No. 52275480, and in part by the Reserve Projects for Centralized Guidance of Local Science and Technology Development Funds under Grant No. QKHZYD [2023]002.
Author information
Authors and Affiliations
Contributions
Conceptualization, R.X. and F.W.; methodology, R.X. and F.W.; software, R.X.; validation, R.X.; formal analysis, R.X. and F.W.; investigation, R.X.; resources, S.L.; writing—original draft preparation, R.X.; writing—review and editing, R.X.; supervision, S.L. and F.W.; project administration, S.L.; funding acquisition, S.L. All authors have read and agreed to the published version of the manuscript.
Corresponding author
Ethics declarations
Conflict of Interest
No conflict of interest exists in the submission of the manuscript entitled "An Improved Northern Goshawk Optimization Algorithm for Feature Selection" to the Journal of Bionic Engineering.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Xie, R., Li, S. & Wu, F. An Improved Northern Goshawk Optimization Algorithm for Feature Selection. J Bionic Eng (2024). https://doi.org/10.1007/s42235-024-00515-5