
A critical problem in benchmarking and analysis of evolutionary computation methods

Abstract

Benchmarking is a cornerstone in the analysis and development of computational methods, especially in the field of evolutionary computation, where theoretical analysis of the algorithms is almost impossible. In this Article, we show that some of the frequently used benchmark functions have their respective optima in the centre of the feasible set and that this poses a critical problem for the analysis of evolutionary computation methods. We carry out an analysis of seven recently published methods and find that they contain a centre-bias operator that lets them easily find optima located in the centre of the feasible set. However, this mechanism makes their comparison with other methods (that do not have a centre bias) meaningless. We compare the computational performance of these seven new methods with two long-standing evolutionary computation methods (‘differential evolution’ and ‘particle swarm optimization’) on shifted problems and on more advanced benchmark problems. Only one of the seven methods performed consistently better than the two older methods, three performed on par, two performed very badly and the worst one performed barely better than a random search. We provide several suggestions that could help to improve analysis and benchmarking in evolutionary computation.
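To make the centre-bias effect concrete, the following minimal Python sketch (an illustration under assumed helper names, not the code released with ref. 30) pits a plain random search against a caricature ‘centre-biased’ search on a Sphere function whose optimum either sits at the centre of the feasible box or is shifted away from it:

import numpy as np

rng = np.random.default_rng(0)

def sphere(x, shift):
    # Sphere function with its optimum relocated to `shift`.
    return float(np.sum((x - shift) ** 2))

def random_search(f, bounds, budget, shift):
    # Uniform random sampling inside the box; a weak baseline.
    lo, hi = bounds
    best = np.inf
    for _ in range(budget):
        x = rng.uniform(lo, hi, size=len(shift))
        best = min(best, f(x, shift))
    return best

def centre_biased_search(f, bounds, budget, shift):
    # Random sampling whose candidates are progressively pulled towards the
    # centre of the feasible box: a caricature of a centre-bias operator.
    lo, hi = bounds
    centre = np.full(len(shift), (lo + hi) / 2.0)
    best = np.inf
    for t in range(budget):
        x = rng.uniform(lo, hi, size=len(shift))
        w = t / budget                    # pull harder as the run proceeds
        x = (1 - w) * x + w * centre
        best = min(best, f(x, shift))
    return best

dim, bounds, budget = 10, (-100.0, 100.0), 2000
for label, shift in [('optimum at centre', np.zeros(dim)),
                     ('optimum shifted', rng.uniform(-80, 80, dim))]:
    rs = random_search(sphere, bounds, budget, shift)
    cb = centre_biased_search(sphere, bounds, budget, shift)
    print(f'{label}: random search {rs:.2f} | centre-biased search {cb:.2f}')

On the centred instance the drift towards the box centre reaches the optimum almost for free, whereas after the shift the same drift points away from the optimum; this is why the comparison in the Article is carried out on shifted problems.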

Fig. 1: Results of the Glicko-2 rating on the ambiguous benchmark set.

Data availability

Data used for the benchmark functions will be made available at https://doi.org/10.24433/CO.1268126.v1 (ref. 30).

Code availability

The code that supports the findings of this study will be made available at https://doi.org/10.24433/CO.1268126.v1 (ref. 30).

References

  1. Campelo, F. & Aranha, C. Evolutionary computation bestiary. https://github.com/fcampelo/EC-Bestiary (accessed 7 February 2022).

  2. Weyland, D. A rigorous analysis of the harmony search algorithm: how the research community can be misled by a novel methodology. Int. J. Appl. Metaheuristic Comput. 12, 50–60 (2010).

  3. Camacho Villalón, C. L., Dorigo, M. & Stützle, T. The intelligent water drops algorithm: why it cannot be considered a novel algorithm. Swarm Intell. 13, 173–192 (2019).

  4. Camacho Villalón, C. L., Stützle, T. & Dorigo, M. Grey wolf, firefly and bat algorithms: three widespread algorithms that do not contain any novelty. In Int. Conference on Swarm Intelligence 121–133 (Springer, 2020).

  5. Camacho Villalón, C. L., Stützle, T. & Dorigo, M. Cuckoo Search ≡ μ+λ – Evolution Strategy — A Rigorous Analysis of an Algorithm that has Been Misleading the Research Community for More Than 10 Years and Nobody Seems to have Noticed TR/IRIDIA/2021-006 (IRIDIA, Université Libre de Bruxelles, 2021).

  6. Piotrowski, A. P., Napiorkowski, J. J. & Rowinski, P. M. How novel is the “novel” black hole optimization approach? Inf. Sci. 267, 191–200 (2014).

  7. Aranha, C. et al. Metaphor‑based metaheuristics, a call for action: the elephant in the room. Swarm Intell. 16, 1–6 (2022).

  8. Hellwig, M. & Beyer, H. G. Benchmarking evolutionary algorithms for single objective real-valued constrained optimization – a critical review. Swarm Evol. Comput. 44, 927–944 (2019).

  9. Garcia-Martinez, C., Gutierrez, P. D., Molina, D., Lozano, M. & Herrera, F. Since CEC 2005 competition on real-parameter optimisation: a decade of research, progress and comparative analysis’s weakness. Soft Comput. 21, 5573–5583 (2017).

  10. Hansen, N., Auger, A., Mersmann, O., Tušar, T. & Brockhoff, D. COCO: a platform for comparing continuous optimizers in a black-box setting. Preprint at https://arxiv.org/abs/1603.08785 (2016).

  11. Suganthan, P. N. Github repository of CEC competitions. GitHub https://github.com/P-N-Suganthan (2022).

  12. Garden, R. W. & Engelbrecht, A. P. Analysis and classification of optimization benchmark functions and benchmark suites. In 2014 IEEE Congress on Evolutionary Computation 1664–1669 (IEEE, 2014).

  13. COCO Data Archives (2022); https://numbbo.github.io/data-archive/

  14. Piotrowski, A. P. Regarding the rankings of optimization heuristics based on artificially-constructed benchmark functions. Inf. Sci. 297, 191–201 (2015).

  15. Tzanetos, A. & Dounias, G. Nature inspired optimization algorithms or simply variations of metaheuristics? Artif. Intell. Rev. 54, 1841–1862 (2021).

  16. Kumar, A., Suganthan, P. N., Mohamed, A. W., Hadi, A. A. & Mohamed, A. K. Special session & competitions on single objective bound constrained numerical optimization. In IEEE Congress on Evolutionary Computation (IEEE, 2021).

  17. Niu, P., Niu, S., Liu, N. & Chang, L. The defect of the Grey Wolf optimization algorithm and its verification method. Knowl.-Based Syst. 171, 37–43 (2019).

  18. Castelli, M., Manzoni, L., Mariot, L., Nobile, M. S. & Tangherloni, A. Salp Swarm Optimization: a critical review. Expert Syst. Appl. 189, 116029 (2022).

  19. Kudela, J. Commentary on: “STOA: A bio-inspired based optimization algorithm for industrial engineering problems” [EAAI, 82 (2019), 148–174] and “Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm for global optimization” [EAAI, 90 (2020), no. 103541]. Eng. Appl. Artif. Intell. 113, 104930 (2022).

  20. Suyanto, S., Ariyanto, A. A. & Ariyanto, A. F. Komodo Mlipir Algorithm. Appl. Soft Comput. 114, 108043 (2022).

  21. Li, S., Chen, H., Wang, M., Heidari, A. A. & Mirjalili, S. Slime mould algorithm: a new method for stochastic optimization. Future Gener. Comput. Syst. 111, 300–323 (2020).

  22. Arora, S. & Singh, S. Butterfly optimization algorithm: a novel approach for global optimization. Soft Comput. 23, 715–734 (2019).

  23. Ahmadianfar, I., Bozorg-Haddad, O. & Chu, X. Gradient-based optimizer: a new metaheuristic optimization algorithm. Inf. Sci. 540, 131–159 (2020).

  24. Oszust, M. Enhanced marine predators algorithm with local escaping operator for global optimization. Knowl.-Based Syst. 232, 107467 (2021).

  25. Heidari, A. A. et al. Harris hawks optimization: algorithm and applications. Future Gener. Comput. Syst. 97, 849–872 (2019).

  26. Dhiman, G. & Kaur, A. STOA: a bio-inspired based optimization algorithm for industrial engineering problems. Eng. Appl. Artif. Intell. 82, 148–174 (2019).

  27. Tanabe, R. & Fukunaga, A. Improving the search performance of SHADE using linear population size reduction. In 2014 IEEE Congress on Evolutionary Computation 1658–1665 (IEEE, 2014).

  28. Zhang, G. & Shi, Y. Hybrid sampling evolution strategy for solving single objective bound constrained problems. In 2018 IEEE Congress on Evolutionary Computation (IEEE, 2018).

  29. Fister, I. et al. On selection of a benchmark by determining the algorithms’ qualities. IEEE Access 9, 51166–51178 (2021).

  30. CodeOcean Capsule (2022); https://doi.org/10.24433/CO.1268126.v1

  31. Bayzidi, H., Talatahari, S., Saraee, M. & Lamarche, C.-P. Social network search for solving engineering optimization problems. Comput. Intell. Neurosci. 9, 8548639 (2021).

  32. Kudela, J. & Matousek, R. New benchmark functions for single-objective optimization based on a zigzag pattern. IEEE Access 10, 8262–8278 (2022).

  33. Vecek, N., Crepinsek, M., Mernik, M. & Hrncic, D. A comparison between different chess rating systems for ranking evolutionary algorithms. In 2014 Federated Conference on Computer Science and Information Systems 511–518 (IEEE, 2014).

  34. Del Ser, J. et al. More is not always better: insights from a massive comparison of meta-heuristic algorithms over real-parameter optimization problems. In IEEE Symposium Series on Computational Intelligence (IEEE, 2021).

  35. Scipy benchmark functions. GitHub https://github.com/scipy/scipy/tree/main/benchmarks/benchmarks/go_benchmark_functions (2022).

  36. Tzanetos, A. & Dounias, G. A comprehensive survey on the applications of swarm intelligence and bio-inspired evolutionary strategies. Mach. Learn. Paradigms 18, 337–378 (2020).

  37. Gleixner, A. et al. MIPLIB 2017: data-driven compilation of the 6th mixed-integer programming library. Math. Program. Comput. 13, 443–490 (2021).

  38. Mohamed, A. W. et al. Problem Definitions and Evaluation Criteria for the CEC 2021 Special Session and Competition on Single Objective Bound Constrained Numerical Optimization (Cairo University, 2020).

  39. Yue, C. T. et al. Problem Definitions and Evaluation Criteria for the CEC 2020 Special Session and Competition on Single Objective Bound Constrained Numerical Optimization Technical report 201911 (Computational Intelligence Laboratory, Zhengzhou University, 2019).

  40. Kudela, J. Novel zigzag-based benchmark functions for bound constrained single objective optimization. In 2021 IEEE Congress on Evolutionary Computation (IEEE, 2021).

  41. Vecek, N., Crepinsek, M. & Mernik, M. On the influence of the number of algorithms, problems, and independent runs in the comparison of evolutionary algorithms. Appl. Soft Comput. 54, 23–45 (2017).

  42. Osaba, E. et al. A tutorial on the design, experimentation and application of metaheuristic algorithms to real-world optimization problems. Swarm Evol. Comput. 64, 100888 (2021).

  43. Doerr, C., Wang, H., Ye, F., van Rijn, S. & Bäck, T. IOHprofiler: a benchmarking and profiling tool for iterative optimization heuristics. Preprint at https://arxiv.org/abs/1810.05281 (2018).

Acknowledgements

This work was supported by the Grant Agency of the Czech Republic project 22-31173S and by the Brno University of Technology project FSI-S-20-6538.

Author information

Contributions

J.K. performed the conceptualization, design, data analysis and interpretation, drafting of the manuscript and critical revision of the manuscript for important intellectual content.

Corresponding author

Correspondence to Jakub Kudela.

Ethics declarations

Competing interests

The author declares no competing interests.

Peer review

Peer review information

Nature Machine Intelligence thanks Alexandros Tzanetos and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary discussion and Tables 1–4.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Kudela, J. A critical problem in benchmarking and analysis of evolutionary computation methods. Nat Mach Intell 4, 1238–1245 (2022). https://doi.org/10.1038/s42256-022-00579-0
