
Surrogate-Assisted LSHADE Algorithm Utilizing Recursive Least Squares Filter

  • Conference paper
  • In: Parallel Problem Solving from Nature – PPSN XVII (PPSN 2022)

Abstract

Surrogate-assisted (meta-model based) algorithms are dedicated to expensive optimization, i.e., optimization in which a single Fitness Function Evaluation (FFE) is considerably time-consuming. Meta-models make it possible to approximate the FFE value without computing it exactly. However, their effective incorporation into Evolutionary Algorithms remains challenging due to the trade-off between accuracy and time complexity. In this paper we present a way of recursively incorporating a meta-model into LSHADE (rmmLSHADE) using a Recursive Least Squares (RLS) filter. The RLS filter updates the meta-model coefficients on a sample-by-sample basis, without maintaining an archive of samples. The performance of rmmLSHADE is measured on the popular CEC2021 benchmark in the expensive scenario, i.e., with an optimization budget of \(10^3\cdot D\) FFEs, where D is the problem dimensionality. rmmLSHADE is compared with the baseline LSHADE and with psLSHADE, a novel algorithm designed specifically for expensive optimization. Experimental evaluation shows that rmmLSHADE distinctly outperforms both algorithms. In addition, the impact of the forgetting factor (an RLS filter parameter) on algorithm performance is examined and a runtime analysis of rmmLSHADE is presented.
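The abstract's central mechanism, updating surrogate coefficients after each true evaluation via an RLS recursion, can be illustrated with a minimal sketch. The Python snippet below implements the generic textbook RLS update over a diagonal-quadratic basis; the class name `RLSMetaModel`, the basis choice, and the defaults `lam` (forgetting factor) and `delta` are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

class RLSMetaModel:
    """Minimal RLS surrogate sketch (hypothetical; not the paper's exact model).

    Maintains linear-in-parameters coefficients w over a fixed basis and
    refines them one (solution, fitness) sample at a time, with no archive.
    """

    def __init__(self, dim, lam=0.99, delta=1e3):
        self.lam = lam                          # forgetting factor (RLS parameter)
        n = self._basis(np.zeros(dim)).size
        self.w = np.zeros(n)                    # meta-model coefficients
        self.P = delta * np.eye(n)              # inverse correlation matrix estimate

    def _basis(self, x):
        # Assumed diagonal-quadratic basis: [1, x_1..x_D, x_1^2..x_D^2]
        return np.concatenate(([1.0], x, x * x))

    def update(self, x, y):
        """Fold in one evaluated sample (x, fitness y)."""
        phi = self._basis(x)
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)      # gain vector
        self.w += k * (y - self.w @ phi)        # correct by prediction error
        self.P = (self.P - np.outer(k, Pphi)) / self.lam

    def predict(self, x):
        """Approximate the FFE value without an exact evaluation."""
        return self.w @ self._basis(x)
```

In a surrogate-assisted loop of this kind, `update` would typically be called after each true FFE and `predict` used to rank or pre-screen candidate solutions before spending further evaluations; how rmmLSHADE integrates the recursion into the LSHADE generation loop, and how the forgetting factor is tuned, is detailed in the paper itself.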



Acknowledgments

Studies were funded by the BIOTECHMED-1 project granted by the Warsaw University of Technology under the Excellence Initiative: Research University (ID-UB) program.

Author information

Correspondence to Mateusz Zaborski.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Zaborski, M., Mańdziuk, J. (2022). Surrogate-Assisted LSHADE Algorithm Utilizing Recursive Least Squares Filter. In: Rudolph, G., Kononova, A.V., Aguirre, H., Kerschke, P., Ochoa, G., Tušar, T. (eds) Parallel Problem Solving from Nature – PPSN XVII. PPSN 2022. Lecture Notes in Computer Science, vol 13398. Springer, Cham. https://doi.org/10.1007/978-3-031-14714-2_11

  • DOI: https://doi.org/10.1007/978-3-031-14714-2_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-14713-5

  • Online ISBN: 978-3-031-14714-2

  • eBook Packages: Computer Science (R0)
