DOI: 10.1145/3594805.3607128
Research Article · Open Access

Self-adaptation Can Improve the Noise-tolerance of Evolutionary Algorithms

Published: 30 August 2023

ABSTRACT

Real-world optimisation often involves uncertainty. Previous studies proved that evolutionary algorithms (EAs) can be robust to noise when using proper parameter settings, including the mutation rate. However, finding an appropriate mutation rate is challenging if the presence of noise (or the noise level) is unknown. Self-adaptation is a parameter control mechanism that adjusts the mutation rate by encoding it in the genome of each individual and evolving it together with the solution. It has been proven effective for problems with unknown structure and for multi-modal problems. Despite this, a rigorous study of self-adaptation in noisy optimisation is missing. This paper mathematically analyses the runtimes of 2-tournament EAs on LeadingOnes, with and without symmetric noise, where the mutation rate is either self-adapted between two given rates, fixed, or chosen uniformly at random from the two given rates. The results show that self-adaptation achieves the lowest runtime regardless of whether symmetric noise is present. In supplementary experiments, we extend the analysis to other types of noise, namely one-bit and bit-wise noise. We also consider another self-adaptation mechanism, which adapts the mutation rate within a given interval. In these experiments, self-adaptive EAs adjust their mutation rate to the noise level and outperform static EAs. Overall, self-adaptation can improve the noise-tolerance of EAs in the noise models studied here.
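To make the setting concrete, below is a minimal, illustrative Python sketch of a non-elitist 2-tournament EA that self-adapts between two mutation rates on LeadingOnes under symmetric noise. It is not the authors' exact algorithm: the parameter values (population size, the mutation parameters chi_low and chi_high, the adaptation probability p_adapt) and the particular symmetric-noise formulation (with probability q the complemented bitstring is evaluated) are assumptions chosen only for illustration.

```python
# Illustrative sketch (assumed parameters, not the paper's exact algorithm):
# a 2-tournament EA self-adapting between two mutation rates chi_low/n and
# chi_high/n on LeadingOnes under symmetric noise.
import random

def leading_ones(x):
    # Number of consecutive 1-bits counted from the left.
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def noisy_fitness(x, q):
    # Symmetric noise (one common formulation, assumed here): with probability q,
    # return the fitness of the complemented bitstring instead of the true fitness.
    if random.random() < q:
        return leading_ones([1 - b for b in x])
    return leading_ones(x)

def two_tournament_self_adaptive_ea(n=50, pop_size=100, noise_q=0.1,
                                    chi_low=0.5, chi_high=2.0, p_adapt=0.2,
                                    max_evals=200_000):
    # Each individual is a pair (bitstring, chi); its mutation rate is chi / n.
    pop = [([random.randint(0, 1) for _ in range(n)],
            random.choice([chi_low, chi_high])) for _ in range(pop_size)]
    evals = 0
    while evals < max_evals:
        new_pop = []
        for _ in range(pop_size):
            # 2-tournament selection on freshly re-evaluated noisy fitness.
            (x1, c1), (x2, c2) = random.sample(pop, 2)
            f1, f2 = noisy_fitness(x1, noise_q), noisy_fitness(x2, noise_q)
            evals += 2
            parent, chi = (x1, c1) if f1 >= f2 else (x2, c2)
            # Self-adaptation: with probability p_adapt, the offspring switches
            # to the other mutation rate before the bitstring is mutated.
            if random.random() < p_adapt:
                chi = chi_high if chi == chi_low else chi_low
            rate = chi / n
            child = [1 - b if random.random() < rate else b for b in parent]
            new_pop.append((child, chi))
            if leading_ones(child) == n:  # true optimum check, for the demo only
                return evals
        pop = new_pop
    return evals

if __name__ == "__main__":
    print("evaluations used:", two_tournament_self_adaptive_ea())
```

In this sketch the mutation parameter chi is inherited by the offspring and only occasionally switched, so selection can bias the population towards whichever of the two rates performs better under the current noise level; this is the mechanism the abstract refers to as encoding the mutation rate in the genome and evolving it.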


Published in

FOGA '23: Proceedings of the 17th ACM/SIGEVO Conference on Foundations of Genetic Algorithms
August 2023
169 pages
ISBN: 9798400702020
DOI: 10.1145/3594805

      Copyright © 2023 Owner/Author

This work is licensed under a Creative Commons Attribution 4.0 International License.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 30 August 2023


      Qualifiers

      • research-article
      • Research
      • Refereed limited

      Acceptance Rates

Overall Acceptance Rate: 72 of 131 submissions, 55%
Article Metrics

• Downloads (last 12 months): 104
• Downloads (last 6 weeks): 15
