Abstract
Recently, multiple classifier systems (MCS) have been used in practical applications to improve classification accuracy. Self-generating neural networks (SGNN) are well suited as base classifiers for an MCS because they require little parameter tuning and learn quickly. However, the computational cost of an MCS grows in proportion to the number of SGNNs it contains. In this paper, we propose a novel method for optimizing the structure of the SGNNs in an MCS, and we evaluate the optimized MCS under two sampling methods. Experiments compare the optimized MCS with an unoptimized MCS, an MCS based on C4.5, and the k-nearest neighbor classifier. The results show that the optimized MCS improves classification accuracy while also reducing computational cost.
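The abstract outlines the general MCS scheme: an ensemble of base classifiers, each trained on a resampled subset of the data (e.g., by bagging), whose outputs are combined by voting. The sketch below illustrates only that generic bagging-plus-voting pipeline in Python; the 1-nearest-neighbor stand-in base classifier and the names bagging_ensemble and predict_majority are illustrative assumptions, not the authors' SGNN construction or their structure-optimization method.

import numpy as np

class NearestNeighborBase:
    # Stand-in base classifier (1-NN). The paper uses SGNN here instead;
    # this class only illustrates the ensemble wiring, not SGNN itself.
    def fit(self, X, y):
        self.X, self.y = X, y
        return self

    def predict(self, X):
        # Label of the closest stored training point for each query row.
        d = np.linalg.norm(X[:, None, :] - self.X[None, :, :], axis=2)
        return self.y[np.argmin(d, axis=1)]

def bagging_ensemble(X, y, n_classifiers=10, seed=0):
    # Train one base classifier per bootstrap sample (bagging).
    rng = np.random.default_rng(seed)
    n = len(X)
    models = []
    for _ in range(n_classifiers):
        idx = rng.integers(0, n, size=n)  # draw n indices with replacement
        models.append(NearestNeighborBase().fit(X[idx], y[idx]))
    return models

def predict_majority(models, X):
    # Combine member outputs by unweighted majority vote.
    # Assumes integer-coded class labels (required by np.bincount).
    votes = np.stack([m.predict(X) for m in models])  # (n_models, n_queries)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

Calling bagging_ensemble(X_train, y_train) and then predict_majority(models, X_test) yields the ensemble's vote on a test set. Note that prediction cost scales linearly with the number of ensemble members, which is exactly the cost the paper targets by optimizing the structure of each SGNN member.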
References
J. Han and M. Kamber. Data Mining: Concepts and Techniques. Morgan Kaufmann Publishers, San Francisco, CA, 2000.
J. R. Quinlan. Bagging, Boosting, and C4.5. In Proceedings of the Thirteenth National Conference on Artificial Intelligence, pages 725–730, Portland, OR, August 4–8, 1996.
G. Rätsch, T. Onoda, and K.-R. Müller. Soft margins for AdaBoost. Machine Learning, 42(3):287–320, March 2001.
H. Mamitsuka and N. Abe. Efficient mining from large databases with query learning. In Proceedings of the 17th International Conference on Machine Learning, pages 575–582, 2000.
C. M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, New York, 1995.
R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. John Wiley & Sons Inc., New York, 2nd ed., 2000.
W. X. Wen, A. Jennings, and H. Liu. Learning a neural tree. In Proceedings of the International Joint Conference on Neural Networks, Beijing, China, 1992. Available at ftp://ftp.cis.ohio-state.edu/pub/neuroprose/wen.sgnt-learn.ps.Z.
T. Kohonen. Self-Organizing Maps. Springer-Verlag, Berlin, 1995.
H. Inoue and H. Narihisa. Improving generalization ability of self-generating neural networks through ensemble averaging. In T. Terano, H. Liu, and A. L. P. Chen, editors, Knowledge Discovery and Data Mining: Current Issues and New Applications, volume 1805 of LNAI, pages 177–180. Springer-Verlag, Berlin, 2000.
M. Stone. Cross-validation: A review. Math. Operationsforsch. Statist., Ser. Statistics, 9(1):127–139, 1978.
L. Breiman. Bagging predictors. Machine Learning, 24:123–140, 1996.
J. R. Quinlan. C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo, CA, USA, 1993.
C. L. Blake and C. J. Merz. UCI repository of machine learning databases. University of California, Irvine, Dept. of Information and Computer Sciences, 1998. Datasets available at http://www.ics.uci.edu/~mlearn/MLRepository.html.
E. A. Patrick and F. P. Fischer. A generalized k-nearest neighbor rule. Information and Control, 16(2):128–152, 1970.
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Inoue, H., Narihisa, H. (2002). Optimizing a Multiple Classifier System. In: Ishizuka, M., Sattar, A. (eds) PRICAI 2002: Trends in Artificial Intelligence. Lecture Notes in Computer Science, vol 2417. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45683-X_32
DOI: https://doi.org/10.1007/3-540-45683-X_32
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44038-3
Online ISBN: 978-3-540-45683-4