ABSTRACT
Ensemble Methods (EMs) combine the decisions of a set of models, their learning algorithms, or different training data in order to obtain better predictions, with the aim of improving generalization capability and overall system performance. However, several issues arise in EM development: designing models that disagree as much as possible on the same data, selecting a subset of them, and combining them optimally to enhance the robustness of the ensemble. Since there is no unified procedure for implementing these steps, this paper proposes a new methodology for designing Neural Network (NN) ensembles using a Genetic Algorithm (GA). First, a set of NNs with a high degree of diversity is produced: a different training data set is drawn for each NN by bootstrap resampling, and the architecture of each NN is varied in the number of hidden neurons, the activation functions, and the initialization of the weights. Second, a GA is employed to select both the best subset of NNs and the optimal combination strategy, ensuring the accuracy and robustness of the ensemble. Experiments on well-known data sets are reported to evaluate the effectiveness of the proposed methodology.
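For concreteness, the two-step methodology summarized above can be illustrated with the following minimal sketch. It is not the paper's implementation: it assumes scikit-learn's MLPRegressor as the base network, plain averaging as the combination rule (the paper also evolves the combination strategy), and a basic binary-chromosome GA with truncation selection, one-point crossover, and bit-flip mutation; the dataset, pool size, and GA parameters are arbitrary choices for illustration.

```python
# Sketch of the two-step methodology described in the abstract (illustrative only).
import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X, y = make_friedman1(n_samples=400, noise=1.0, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Step 1: build a diverse pool of NNs via bootstrap resampling and
# varied architectures (hidden neurons, activation, weight initialization).
pool = []
for _ in range(20):
    idx = rng.integers(0, len(X_tr), len(X_tr))            # bootstrap sample
    net = MLPRegressor(hidden_layer_sizes=(int(rng.integers(3, 15)),),
                       activation=str(rng.choice(["tanh", "relu", "logistic"])),
                       random_state=int(rng.integers(10**6)),  # varies initial weights
                       max_iter=2000)
    pool.append(net.fit(X_tr[idx], y_tr[idx]))

preds_val = np.array([m.predict(X_val) for m in pool])      # (n_models, n_val)

# Step 2: a GA selects the subset of NNs; fitness is the validation MSE of the
# averaged prediction of the selected members (averaging stands in for the
# combination strategies considered in the paper).
def fitness(chrom):
    if chrom.sum() == 0:
        return np.inf
    return mean_squared_error(y_val, preds_val[chrom.astype(bool)].mean(axis=0))

pop = rng.integers(0, 2, size=(30, len(pool)))              # random binary chromosomes
for _ in range(50):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[:10]]                  # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(0, len(parents), 2)]
        cut = int(rng.integers(1, len(pool)))               # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child[rng.random(len(pool)) < 0.05] ^= 1            # bit-flip mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmin([fitness(c) for c in pop])]
print("selected members:", np.flatnonzero(best), "validation MSE:", fitness(best))
```

The chromosome here only encodes ensemble membership; extending it with genes that select the combination rule (e.g., simple average, weighted average, median) would follow the same GA loop.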