DOI: 10.1145/1276958.1277068
Article

Addressing sampling errors and diversity loss in UMDA

Published: 07 July 2007

ABSTRACT

Estimation of distribution algorithms replace the typical crossover and mutation operators by constructing a probabilistic model and generating offspring according to this model. Previous studies have shown that this generally leads to diversity loss due to sampling errors. In this paper, for the case of the simple Univariate Marginal Distribution Algorithm (UMDA), we propose and test several methods for counteracting diversity loss. Diversity loss can arise in two phases: sampling from the probability model (offspring generation) and selection. We show that it is possible to completely remove the sampling error during offspring generation. Furthermore, we examine several plausible model-construction variants which counteract diversity loss during selection, and demonstrate that these update rules work better than the standard update on a variety of simple test problems.
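To make the source of the sampling error concrete, the sketch below contrasts standard per-bit Bernoulli sampling in a binary UMDA with an exact sampler that fixes the count of ones in each column and shuffles, so the offspring marginal frequencies match the model exactly whenever n·p_i is an integer. This is an illustrative reading of the idea only, not necessarily the construction used in the paper; the function names are hypothetical.

```python
import random

def sample_standard(p, n):
    """Sample n binary offspring from univariate marginals p.
    Each bit is an independent Bernoulli draw, so the realized
    offspring frequencies fluctuate around p (sampling error)."""
    return [[1 if random.random() < pi else 0 for pi in p]
            for _ in range(n)]

def sample_exact(p, n):
    """Sample n offspring whose column frequencies match p exactly
    (when n * p_i is an integer): place exactly that many ones in
    column i, then shuffle the column to randomize which offspring
    receive them. Removes the sampling error in offspring generation."""
    cols = []
    for pi in p:
        k = round(n * pi)              # number of ones in this column
        col = [1] * k + [0] * (n - k)
        random.shuffle(col)            # random assignment to offspring
        cols.append(col)
    # Transpose columns back into individuals.
    return [list(ind) for ind in zip(*cols)]
```

For example, with `p = [0.5, 0.25, 1.0]` and `n = 8`, every population returned by `sample_exact` has column frequencies of exactly 0.5, 0.25, and 1.0, whereas `sample_standard` only matches them in expectation.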


Published in

GECCO '07: Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation
July 2007, 2313 pages
ISBN: 9781595936974
DOI: 10.1145/1276958

          Copyright © 2007 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

GECCO '07 Paper Acceptance Rate: 266 of 577 submissions, 46%
Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%
