DOI: 10.1145/2330163.2330259
Research Article

A genetic algorithm for designing neural network ensembles

Published: 7 July 2012

ABSTRACT

Ensemble Methods (EMs) combine the decisions of a set of models, their learning algorithms, or different views of the data to obtain better predictions. The motivation is the possibility of improving both the generalization capability and the overall performance of the system. However, several issues arise in EM development, such as designing models that disagree with one another as much as possible on the same data, selecting a subset of them, and combining them optimally to enhance the robustness of the ensemble. Since there is no unified procedure for implementing these steps, this paper proposes a new methodology for designing Neural Network (NN) ensembles using a Genetic Algorithm (GA). First, a pool of NNs with a high degree of diversity is produced: a different training set is drawn for each NN by bootstrap resampling, and the architecture of each NN is varied in terms of the number of hidden neurons, the activation functions, and the initialization of the weights. Second, a GA selects both the best subset of NNs and the optimal combination strategy, ensuring the accuracy and robustness of the ensemble. Experiments on well-known data sets are reported to evaluate the effectiveness of the proposed methodology.
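To make the two-stage pipeline concrete, the sketch below illustrates one plausible reading of it in Python. It is not the authors' implementation: the use of NumPy and scikit-learn's MLPRegressor, the pool size, the GA operators (elitist selection, one-point crossover, bit-flip mutation), and all parameter values are illustrative assumptions, and the combination strategy here is fixed to a simple average rather than evolved as in the paper.

```python
# A minimal sketch of the two-stage methodology described in the abstract.
# NOT the authors' implementation: library choices, GA operators, and all
# parameter values are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.datasets import make_friedman1
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = make_friedman1(n_samples=300, random_state=0)     # stand-in data set
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

# Stage 1: build a diverse pool of NNs, each trained on its own bootstrap
# sample, with varying hidden-layer size, activation function, and weight
# initialization (controlled here via random_state).
pool = []
for i in range(20):
    idx = rng.integers(0, len(X_tr), len(X_tr))          # bootstrap draw
    net = MLPRegressor(
        hidden_layer_sizes=(int(rng.integers(3, 15)),),
        activation=str(rng.choice(['tanh', 'relu', 'logistic'])),
        random_state=i, max_iter=2000)
    net.fit(X_tr[idx], y_tr[idx])
    pool.append(net)

preds = np.stack([m.predict(X_val) for m in pool])       # pool predictions

# Stage 2: a simple GA over binary masks that selects the subset of nets
# whose simple-average combination minimizes validation MSE.
def fitness(mask):
    if not mask.any():
        return np.inf                                    # empty ensembles are invalid
    return np.mean((preds[mask].mean(axis=0) - y_val) ** 2)

popn = rng.integers(0, 2, size=(30, len(pool))).astype(bool)
for gen in range(50):
    scores = np.array([fitness(ind) for ind in popn])
    popn = popn[np.argsort(scores)]                      # elitist sort
    for k in range(15, 30):                              # replace worst half
        p1, p2 = popn[rng.integers(0, 15, 2)]            # parents from top half
        cut = rng.integers(1, len(pool))
        child = np.concatenate([p1[:cut], p2[cut:]])     # one-point crossover
        popn[k] = child ^ (rng.random(len(pool)) < 0.05)  # bit-flip mutation

best = popn[0]
print(f"selected {best.sum()} of {len(pool)} nets, "
      f"val MSE = {fitness(best):.4f}")
```

In the paper's formulation the chromosome would additionally encode the combination strategy itself; extending the binary mask above with, for instance, per-network combination weights would be one way to capture that.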


• Published in

  GECCO '12: Proceedings of the 14th annual conference on Genetic and evolutionary computation
  July 2012, 1396 pages
  ISBN: 9781450311779
  DOI: 10.1145/2330163

        Copyright © 2012 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

        Publisher

        Association for Computing Machinery

        New York, NY, United States



Acceptance Rates

Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%
