Abstract
We empirically evaluate several state-of-the-art methods for constructing ensembles of classifiers with stacking and show that they perform (at best) comparably to selecting the best classifier from the ensemble by cross-validation. We then propose a new method for stacking that uses multi-response model trees at the meta-level, and show that it outperforms existing stacking approaches, as well as selecting the best classifier from the ensemble by cross-validation.
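The two approaches compared in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact setup: it assumes scikit-learn base classifiers and uses a `DecisionTreeRegressor` as a stand-in for M5' model trees at the meta-level. "Multi-response" here means one regression model per class, each fitted to a 0/1 indicator of its class; the predicted class is the one whose model responds highest.

```python
# Hedged sketch of stacking with a multi-response regression meta-learner,
# contrasted with selecting the best base classifier by cross-validation.
# All model choices (iris data, GaussianNB/kNN/decision tree base level,
# DecisionTreeRegressor in place of model trees) are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_predict, cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

X, y = load_iris(return_X_y=True)
base = [GaussianNB(), KNeighborsClassifier(), DecisionTreeClassifier(random_state=0)]

# Meta-level attributes: cross-validated class-probability distributions
# from each base classifier, so the meta-learner never sees predictions
# that the base classifiers made on their own training data.
meta_X = np.hstack([
    cross_val_predict(clf, X, y, cv=5, method="predict_proba") for clf in base
])

# Multi-response meta-level: one regression tree per class, trained on a
# 0/1 indicator of that class; predict the class with the highest response.
n_classes = len(np.unique(y))
trees = [
    DecisionTreeRegressor(max_depth=3, random_state=0).fit(meta_X, (y == c).astype(float))
    for c in range(n_classes)
]
responses = np.column_stack([t.predict(meta_X) for t in trees])
stacked_pred = responses.argmax(axis=1)

# Baseline the paper compares against: pick the single base classifier
# with the best cross-validated accuracy.
best = max(base, key=lambda clf: cross_val_score(clf, X, y, cv=5).mean())
print("stacking (training) accuracy:", (stacked_pred == y).mean())
print("best single classifier:", type(best).__name__)
```

The key design point is that the meta-level training set is built from cross-validated base-level predictions; fitting the meta-learner on in-sample predictions would let over-confident base classifiers dominate.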
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Džeroski, S., Ženko, B. (2002). Stacking with Multi-response Model Trees. In: Roli, F., Kittler, J. (eds) Multiple Classifier Systems. MCS 2002. Lecture Notes in Computer Science, vol 2364. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45428-4_20
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-43818-2
Online ISBN: 978-3-540-45428-1