Abstract
We consider the problem of online classification in nonstationary environments. Specifically, we take a Bayesian approach to sequential parameter estimation of a logistic multiple classifier system (MCS), and compare this method with other algorithms for nonstationary classification. We comment on several design considerations.
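Sequential Bayesian estimation of logistic-regression weights can be sketched as follows. This is a minimal, illustrative implementation only, not the chapter's algorithm: it assumes a random-walk state evolution (so the model can track drift in a nonstationary environment) and a diagonal, Kalman-filter-style Gaussian approximation to the posterior; the function name and the state-noise parameter `q` are hypothetical.

```python
import math

def dynamic_logistic_update(w, P, x, y, q=0.01):
    """One sequential Bayesian update for dynamic logistic regression.

    w : posterior mean of the weights (list of floats)
    P : marginal posterior variances (diagonal covariance approximation)
    x : feature vector; y : observed label in {0, 1}
    q : state-noise variance added each step so old data are discounted
    """
    # Predict step: random-walk evolution w_t = w_{t-1} + noise inflates
    # the posterior variance, which keeps the model adaptive to drift.
    P = [p + q for p in P]
    # Predicted probability under the logistic link.
    eta = sum(wi * xi for wi, xi in zip(w, x))
    mu = 1.0 / (1.0 + math.exp(-eta))
    # Local curvature of the log-likelihood (Bernoulli variance).
    s = mu * (1.0 - mu)
    denom = 1.0 + s * sum(p * xi * xi for p, xi in zip(P, x))
    # Update step: Kalman-style gain applied to the prediction error y - mu.
    w = [wi + (p * xi * (y - mu)) / denom for wi, p, xi in zip(w, P, x)]
    P = [p - (s * (p * xi) ** 2) / denom for p, xi in zip(P, x)]
    return w, P
```

Processing a stream means calling this once per observation; because `q > 0` the variances never collapse to zero, so the weights keep responding to a changing concept rather than freezing on early data.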
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Tomas, A. (2011). A Dynamic Logistic Multiple Classifier System for Online Classification. In: Sansone, C., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2011. Lecture Notes in Computer Science, vol 6713. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21557-5_7
Print ISBN: 978-3-642-21556-8
Online ISBN: 978-3-642-21557-5