
Combination of Multiple Nearest Neighbor Classifiers Based on Feature Subset Clustering Method

  • Conference paper
Advances in Machine Learning and Cybernetics

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3930)


Abstract

This paper proposes FC-MNNC, a new method based on feature subset clustering for combining multiple nearest neighbor classifiers (NNCs) to obtain better performance than a single NNC. In FC-MNNC, the feature set is partitioned into subsets; the component NNCs, each built on one subset, classify a pattern independently and in parallel, and the final decision is aggregated by the majority voting rule. Two methods are used to partition the feature set: in method I, a genetic algorithm (GA) clusters the features into subsets according to the accuracy of the combined classification; method II is a transitive closure clustering method based on the pairwise correlation between features. To demonstrate the performance of FC-MNNC, we conduct experiments on four UCI databases. The results show that: (i) within FC-MNNC, method II does not outperform method I; (ii) FC-MNNC based on method I is more accurate than both the standard NNC and an individual classifier with GA-based feature selection; (iii) FC-MNNC based on method I performs no worse than GA-based feature subset selection over multiple NNCs; and (iv) FC-MNNC is robust against irrelevant features.
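The pipeline described in the abstract can be illustrated briefly. The sketch below is a minimal, editorial rendering of the FC-MNNC idea in Python, assuming scikit-learn-style inputs (a feature matrix X and label vector y): features are grouped by a correlation threshold followed by a transitive closure, loosely in the spirit of method II; one 1-NN component classifier is trained per subset; and the final label comes from majority voting. The threshold value, the helper names, and the use of scikit-learn's KNeighborsClassifier are assumptions made for illustration, not the authors' exact procedure (method I's GA-driven clustering is not shown).

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def correlation_clusters(X, threshold=0.5):
    """Group features whose pairwise |correlation| meets the threshold, then
    take the transitive closure so that linked groups merge into one subset."""
    n_features = X.shape[1]
    linked = np.abs(np.corrcoef(X, rowvar=False)) >= threshold
    closure = linked.copy()
    for _ in range(n_features):  # iterate until the relation stops growing
        grown = closure | ((closure.astype(int) @ closure.astype(int)) > 0)
        if np.array_equal(grown, closure):
            break
        closure = grown
    clusters, assigned = [], np.zeros(n_features, dtype=bool)
    for j in range(n_features):
        if not assigned[j]:
            members = np.where(closure[j])[0]
            clusters.append(members)
            assigned[members] = True
    return clusters


def fit_fc_mnnc(X_train, y_train, clusters):
    """Train one 1-NN component classifier per feature subset."""
    components = []
    for subset in clusters:
        clf = KNeighborsClassifier(n_neighbors=1)
        clf.fit(X_train[:, subset], y_train)
        components.append((subset, clf))
    return components


def predict_fc_mnnc(components, X_test):
    """Each component NNC votes independently; majority voting decides."""
    votes = np.array([clf.predict(X_test[:, subset]) for subset, clf in components])
    final = []
    for column in votes.T:  # one column of votes per test pattern
        labels, counts = np.unique(column, return_counts=True)
        final.append(labels[np.argmax(counts)])
    return np.array(final)
```

Usage would follow the obvious order: `clusters = correlation_clusters(X_train)`, then `components = fit_fc_mnnc(X_train, y_train, clusters)`, then `predict_fc_mnnc(components, X_test)`; accuracy against the labels of the four UCI datasets would reproduce the kind of comparison reported in the paper, though not the authors' exact numbers.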




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wang, LJ., Hua, Q., Wang, XL., Chen, QC. (2006). Combination of Multiple Nearest Neighbor Classifiers Based on Feature Subset Clustering Method. In: Yeung, D.S., Liu, ZQ., Wang, XZ., Yan, H. (eds) Advances in Machine Learning and Cybernetics. Lecture Notes in Computer Science, vol 3930. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11739685_56


  • DOI: https://doi.org/10.1007/11739685_56

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-33584-9

  • Online ISBN: 978-3-540-33585-6

  • eBook Packages: Computer Science (R0)
