Abstract
Classification becomes difficult when a data set is of critical size, i.e. when the number of training samples equals their dimensionality. Dissimilarity data poses exactly this problem, since each object is represented by its distances to the objects of the learning set. In such a case, a simple classifier is expected to generalize better than a complex one, and earlier experiments [9,3] confirm that linear decision rules indeed perform reasonably well on dissimilarity representations. For the Pseudo-Fisher linear discriminant, however, this situation is the most inconvenient one: its generalization error reaches a maximum when the size of the learning set equals the dimensionality [10]. Some improvement is still possible. Combined classifiers may handle this problem better when a more powerful decision rule is found. In this paper, the usefulness of bagging and boosting the Fisher linear discriminant for dissimilarity data is discussed, and a new method based on random subspaces is proposed. This technique ultimately yields a single linear pattern recognizer and still significantly improves the accuracy.
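The random-subspace idea sketched in the abstract can be illustrated in a few lines: train a pseudo-Fisher discriminant on several random subsets of the dissimilarity features (columns of the distance matrix) and average the resulting weight vectors, which again gives a single linear classifier in the full dissimilarity space. The code below is a minimal sketch under stated assumptions, not the authors' implementation; the function names and the parameter values `n_subspaces` and `k` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_weights(X, y):
    """Two-class Fisher discriminant via a pseudo-inverse (pseudo-Fisher)."""
    m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    # Pooled within-class scatter; pinv keeps this defined when singular.
    Sw = np.cov(X[y == 0], rowvar=False) + np.cov(X[y == 1], rowvar=False)
    w = np.linalg.pinv(Sw) @ (m1 - m0)
    b = -w @ (m0 + m1) / 2.0          # threshold midway between the means
    return w, b

def random_subspace_fisher(D, y, n_subspaces=20, k=10):
    """Average Fisher discriminants trained on random column subsets of the
    dissimilarity matrix D (n_samples x n_prototypes); the average of linear
    classifiers is itself a single linear classifier."""
    n, p = D.shape
    w_sum, b_sum = np.zeros(p), 0.0
    for _ in range(n_subspaces):
        idx = rng.choice(p, size=k, replace=False)
        w, b = fisher_weights(D[:, idx], y)
        w_sum[idx] += w                # embed subspace weights in full space
        b_sum += b
    return w_sum / n_subspaces, b_sum / n_subspaces

# Toy demonstration of the critical-size case: n samples, n features.
n_per = 20
X = np.vstack([rng.normal(0, 1, (n_per, 5)),
               rng.normal(3, 1, (n_per, 5))])
y = np.array([0] * n_per + [1] * n_per)
# Dissimilarity representation: Euclidean distances to all training objects,
# so D is square (dimensionality equals the number of samples).
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)

w, b = random_subspace_fisher(D, y)
pred = (D @ w + b > 0).astype(int)
acc = (pred == y).mean()
```

Because each subspace classifier is linear and the combination is an average, the result is one weight vector `w` over all dissimilarity features, matching the paper's claim of obtaining only a single linear pattern recognizer.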
References
L. Breiman. Bagging predictors. Machine Learning, 24(2):123–140, 1996.
C. Cortes and V. Vapnik. Support-vector networks. Machine Learning, 20:273–297, 1995.
R. P. W. Duin, E. Pękalska, and D. de Ridder. Relational discriminant analysis. Pattern Recognition Letters, 20(11–13):1175–1181, 1999.
Y. Freund and R. E. Schapire. Experiments with a new boosting algorithm. In Machine Learning: Proc. of the 13th International Conference, pages 148–156, 1996.
K. Fukunaga. Introduction to Statistical Pattern Recognition. Academic Press, 1990.
T. K. Ho. Nearest neighbours in random subspaces. In Proceedings of the Second International Workshop on Statistical Techniques in Pattern Recognition, pages 640–648, Sydney (Australia), 1998.
T. K. Ho. The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8):832–844, 1998.
A. K. Jain and B. Chandrasekaran. Dimensionality and sample size considerations in pattern recognition practice. In P. R. Krishnaiah and L. N. Kanal, editors, Handbook of Statistics, volume 2, pages 835–855. North-Holland, Amsterdam, 1987.
E. Pękalska and R. P. W. Duin. Classifiers for dissimilarity-based pattern recognition. In ICPR, Barcelona (Spain), 2000, accepted.
S. Raudys and R. P. W. Duin. On expected classification error of the Fisher linear classifier with pseudo-inverse covariance matrix. Pattern Recognition Letters, 19(5–6), 1998.
M. Skurichina and R. P. W. Duin. Bagging for linear classifiers. Pattern Recognition, 31(7):909–930, 1998.
M. Skurichina and R. P. W. Duin. Boosting in linear discriminant analysis. In First International Workshop on Multiple Classifier Systems, Cagliari (Italy), 2000.
C.L. Wilson and M.D. Garris. Handprinted character database 3. Technical report, National Institute of Standards and Technology, February 1992.
A. Ypma, D. M. J. Tax, and R. P. W. Duin. Robust machine fault detection with independent component analysis and support vector data description. In IEEE International Workshop on Neural Networks for Signal Processing, pages 67–76, Wisconsin (USA), 1999.
© 2000 Springer-Verlag Berlin Heidelberg
Pękalska, E., Skurichina, M., Duin, R.P.W. (2000). Combining Fisher Linear Discriminants for Dissimilarity Representations. In: Multiple Classifier Systems. MCS 2000. Lecture Notes in Computer Science, vol 1857. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45014-9_11
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-67704-8
Online ISBN: 978-3-540-45014-6