Combining Fisher Linear Discriminants for Dissimilarity Representations

  • Conference paper
Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 1857))

Abstract

Classification becomes difficult when a data set has the critical size, i.e. when the number of training samples equals the dimensionality. Dissimilarity data pose exactly this problem, since each object is represented by its dissimilarities to all training objects. In such a case, a simple classifier is expected to generalize better than a complex one. Earlier experiments [9,3] confirm that linear decision rules indeed perform reasonably well on dissimilarity representations. For the Pseudo-Fisher linear discriminant, however, this is the most inconvenient situation, since its generalization error reaches its maximum when the size of the learning set equals the dimensionality [10]. Some improvement is still possible: combined classifiers may handle this problem better by constructing a more powerful decision rule. In this paper, the usefulness of bagging and boosting the Fisher linear discriminant for dissimilarity data is discussed, and a new method based on random subspaces is proposed. This technique yields only a single linear pattern recognizer in the end and still significantly improves the accuracy.
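To make the random-subspace idea concrete, the following is a minimal sketch, not the paper's exact algorithm: it trains a pseudo-Fisher linear discriminant on random subsets of the dissimilarity "features" (columns of the distance matrix) and combines them by averaging the weight vectors, embedded back into the full space, so that a single linear classifier results. The function names (`fisher_ld`, `rsm_fisher`), the averaging rule, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_ld(X, y):
    """Pseudo-Fisher linear discriminant for two classes (labels 0/1).
    Returns (w, b) of the rule: assign class 1 iff X @ w + b > 0."""
    m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    Sw = np.cov(X[y == 0], rowvar=False) + np.cov(X[y == 1], rowvar=False)
    w = np.linalg.pinv(Sw) @ (m1 - m0)   # pseudo-inverse handles singular Sw
    b = -w @ (m0 + m1) / 2
    return w, b

def rsm_fisher(D, y, n_subspaces=20, k=10, rng=rng):
    """Random-subspace combination (illustrative): train an FLD on k randomly
    chosen columns of the dissimilarity matrix D, embed each weight vector
    back into the full space, and average -> one linear classifier (W, B)."""
    n, d = D.shape
    W, B = np.zeros(d), 0.0
    for _ in range(n_subspaces):
        idx = rng.choice(d, size=k, replace=False)
        w, b = fisher_ld(D[:, idx], y)
        W[idx] += w
        B += b
    return W / n_subspaces, B / n_subspaces

# Toy dissimilarity representation: Euclidean distances to the training set,
# so the number of samples equals the dimensionality (the critical size).
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(2, 1, (20, 5))])
y = np.r_[np.zeros(20), np.ones(20)]
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # 40 x 40
W, B = rsm_fisher(D, y)
acc = np.mean((D @ W + B > 0) == y)
```

Because the combined decision function is the mean of the subspace discriminant functions, the result is still a single linear rule in the full dissimilarity space, which matches the abstract's claim that only one linear pattern recognizer remains in the end.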



References

  1. L. Breiman. Bagging predictors. Machine Learning, 24(2):123–140, 1996.
  2. C. Cortes and V. Vapnik. Support-vector networks. Machine Learning, 20:273–297, 1995.
  3. R. P. W. Duin, E. Pękalska, and D. de Ridder. Relational discriminant analysis. Pattern Recognition Letters, 20(11–13):1175–1181, 1999.
  4. Y. Freund and R. E. Schapire. Experiments with a new boosting algorithm. In Machine Learning: Proc. of the 13th International Conference, pages 148–156, 1996.
  5. K. Fukunaga. Introduction to Statistical Pattern Recognition. Academic Press, 1990.
  6. T. K. Ho. Nearest neighbours in random subspaces. In Proceedings of the Second International Workshop on Statistical Techniques in Pattern Recognition, pages 640–648, Sydney (Australia), 1998.
  7. T. K. Ho. The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8):832–844, 1998.
  8. A. K. Jain and B. Chandrasekaran. Dimensionality and sample size considerations in pattern recognition practice. In P. R. Krishnaiah and L. N. Kanal, editors, Handbook of Statistics, volume 2, pages 835–855. North-Holland, Amsterdam, 1987.
  9. E. Pękalska and R. P. W. Duin. Classifiers for dissimilarity-based pattern recognition. In ICPR, Barcelona (Spain), 2000, accepted.
  10. S. Raudys and R. P. W. Duin. On expected classification error of the Fisher linear classifier with pseudo-inverse covariance matrix. Pattern Recognition Letters, 19(5–6), 1998.
  11. M. Skurichina and R. P. W. Duin. Bagging for linear classifiers. Pattern Recognition, 31(7):909–930, 1998.
  12. M. Skurichina and R. P. W. Duin. Boosting in linear discriminant analysis. In First International Workshop on Multiple Classifier Systems, Cagliari (Italy), 2000.
  13. C. L. Wilson and M. D. Garris. Handprinted character database 3. Technical report, National Institute of Standards and Technology, February 1992.
  14. A. Ypma, D. M. J. Tax, and R. P. W. Duin. Robust machine fault detection with independent component analysis and support vector data description. In IEEE International Workshop on Neural Networks for Signal Processing, pages 67–76, Wisconsin (USA), 1999.

Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pękalska, E., Skurichina, M., Duin, R.P.W. (2000). Combining Fisher Linear Discriminants for Dissimilarity Representations. In: Multiple Classifier Systems. MCS 2000. Lecture Notes in Computer Science, vol 1857. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45014-9_11

  • DOI: https://doi.org/10.1007/3-540-45014-9_11

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-67704-8

  • Online ISBN: 978-3-540-45014-6
