
An Improved Northern Goshawk Optimization Algorithm for Feature Selection

  • Research Article
  • Published in: Journal of Bionic Engineering

Abstract

Feature Selection (FS) is an important data management technique that aims to minimize redundant information in a dataset. This work proposes DENGO, an improved version of the Northern Goshawk Optimization (NGO) algorithm, to address the FS problem. NGO is an efficient swarm-based algorithm inspired by the predatory behavior of the northern goshawk. To overcome NGO's tendency to become trapped in local optima, its slow convergence speed, and its low convergence accuracy, two strategies are introduced into the original algorithm. First, a learning strategy is proposed in which search members learn from the information differences among other members of the population, enhancing the algorithm's global search ability while improving population diversity. Second, a hybrid differential strategy is proposed to improve the algorithm's ability to escape local optima by perturbing individuals, thereby improving convergence accuracy and speed. To demonstrate its effectiveness, DENGO is compared against eleven advanced algorithms on the CEC2015 and CEC2017 benchmark functions, and the results show that DENGO has stronger global exploration capability with higher convergence performance and stability. The proposed DENGO is then applied to FS, and experiments on 29 benchmark datasets from the UCI repository show that the DENGO-based FS method achieves higher classification accuracy and stability than eight other popular FS methods; DENGO is therefore considered one of the most promising FS techniques. DENGO's code can be obtained at https://www.mathworks.com/matlabcentral/fileexchange/158811-project1.
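The two strategies described above can be illustrated with a toy continuous-optimization sketch. This is not the authors' implementation (which is available at the MathWorks link above), but a minimal, hypothetical Python analogue under stated assumptions: each member learns from the difference between two randomly chosen population members (a stand-in for the learning strategy), and is additionally perturbed toward the current best with greedy selection (a stand-in for the hybrid differential strategy). The function names `sphere` and `dengo_sketch` are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def sphere(x):
    """Toy objective: minimize the sum of squares (global minimum 0 at the origin)."""
    return float(np.sum(x ** 2))

def dengo_sketch(obj, dim=10, pop_size=20, iters=200, lb=-5.0, ub=5.0):
    """Illustrative difference-vector search loop, not the published DENGO."""
    # Initialize the population uniformly inside [lb, ub]^dim.
    pop = rng.uniform(lb, ub, (pop_size, dim))
    fit = np.array([obj(p) for p in pop])
    for _ in range(iters):
        # Copy so later in-place updates of pop cannot alter the snapshot.
        best = pop[np.argmin(fit)].copy()
        for i in range(pop_size):
            # (1) Learning-strategy stand-in: move along the difference
            # between two random members, so each individual exploits the
            # "information gap" present in the population.
            r1, r2 = rng.choice(pop_size, size=2, replace=False)
            cand = pop[i] + rng.random(dim) * (pop[r1] - pop[r2])
            # (2) Differential-perturbation stand-in: nudge the candidate
            # toward the current best to help escape local optima.
            cand = cand + 0.5 * rng.random(dim) * (best - pop[i])
            cand = np.clip(cand, lb, ub)
            # Greedy selection: accept the candidate only if it improves.
            f = obj(cand)
            if f < fit[i]:
                pop[i], fit[i] = cand, f
    idx = np.argmin(fit)
    return pop[idx], float(fit[idx])

best_x, best_f = dengo_sketch(sphere)
print(best_f)
```

On a smooth test function such as the sphere, the greedy selection guarantees monotonically non-increasing fitness, while the difference-vector moves supply the population-driven exploration that the abstract attributes to the learning strategy.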


Data Availability

The data that support the findings of this study are available from the corresponding author.


Funding

This work is supported in part by the National Natural Science Foundation of China top-level program under Grant No. 52275480, and in part by the Reserve Projects for Centralized Guidance of Local Science and Technology Development Funds under Grant No. QKHZYD [2023]002.

Author information

Authors and Affiliations

Authors

Contributions

Conceptualization, R.X. and F.W.; methodology, R.X. and F.W.; software, R.X.; validation, R.X.; formal analysis, R.X. and F.W.; investigation, R.X.; resources, S.L.; writing (original draft preparation), R.X.; writing (review and editing), R.X.; supervision, S.L. and F.W.; project administration, S.L.; funding acquisition, S.L. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Shaobo Li.

Ethics declarations

Conflict of Interest

The authors declare that no conflict of interest exists in the submission of this manuscript, entitled "An Improved Northern Goshawk Optimization Algorithm for Feature Selection", to the Journal of Bionic Engineering.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Xie, R., Li, S. & Wu, F. An Improved Northern Goshawk Optimization Algorithm for Feature Selection. J Bionic Eng (2024). https://doi.org/10.1007/s42235-024-00515-5
