
Fusion trees for fast and accurate classification of hyperspectral data with ensembles of \(\gamma\)-divergence-based RBF networks

  • Advances in Intelligent Data Processing and Analysis
  • Published in: Neural Computing and Applications

Abstract

Ensembles of RBF networks trained with \(\gamma\)-divergence-based similarity measures can significantly improve the classification accuracy of hyperspectral imaging data compared to any single RBF network as well as to RBF ensembles based on the Euclidean distance. A drawback of classifier ensembles, however, is the need to compute the outputs of a typically large number of RBF networks prior to combination. In this paper, a modified approach to the fusion of classifier outputs is proposed which is based on decision trees. It is shown for several real-world datasets that, on average, only a small subset of the RBF networks contributes to the decisions. Hence, for any decision, computing only the required RBF network outputs conditionally yields a significant decrease in computational cost. Additionally, a selection scheme for subsets of RBF classifiers based on their relevance in the fusion process is proposed. This alternative approach can be used if the analysis requires fixed settings, e.g., to meet time constraints.
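The abstract bundles two technical ideas: RBF networks whose hidden units compare spectra via the \(\gamma\)-divergence rather than the Euclidean distance, and a decision tree over the classifier outputs that enables conditional evaluation of the ensemble. The sketch below illustrates both. It is a minimal illustration under stated assumptions, not the authors' implementation: the Fujisawa–Eguchi form of the discrete \(\gamma\)-divergence is assumed, and the names gamma_divergence, rbf_activations, FusionNode, and fuse are hypothetical.

```python
import numpy as np

def gamma_divergence(p, q, gamma=0.5, eps=1e-12):
    """Discrete gamma-divergence D_gamma(p || q) between two
    non-negative spectra (assumed Fujisawa-Eguchi form). As
    gamma -> 0 it approaches the Kullback-Leibler divergence;
    unlike the Euclidean distance, it is not symmetric."""
    p = np.asarray(p, dtype=float) + eps   # avoid log(0) on empty bands
    q = np.asarray(q, dtype=float) + eps
    term_p = np.log(np.sum(p ** (1.0 + gamma))) / (gamma * (1.0 + gamma))
    term_q = np.log(np.sum(q ** (1.0 + gamma))) / (1.0 + gamma)
    cross  = np.log(np.sum(p * q ** gamma)) / gamma
    return term_p + term_q - cross

def rbf_activations(x, centers, width=1.0, gamma=0.5):
    """Hidden-layer activations of an RBF network in which the usual
    squared Euclidean distance is replaced by the gamma-divergence."""
    d = np.array([gamma_divergence(x, c, gamma) for c in centers])
    return np.exp(-d / (2.0 * width ** 2))

class FusionNode:
    """Node of a fusion tree. An internal node thresholds the output of
    one base RBF classifier; a leaf stores the final class label."""
    def __init__(self, clf_index=None, threshold=None,
                 left=None, right=None, label=None):
        self.clf_index = clf_index
        self.threshold = threshold
        self.left, self.right = left, right
        self.label = label

def fuse(root, x, classifiers):
    """Classify x by walking the fusion tree. Base classifiers are
    evaluated lazily: only those on the root-to-leaf path are run,
    each at most once (memoized in `outputs`)."""
    outputs = {}
    node = root
    while node.label is None:
        i = node.clf_index
        if i not in outputs:
            outputs[i] = classifiers[i](x)   # e.g. an RBF class score
        node = node.left if outputs[i] <= node.threshold else node.right
    return node.label
```

Because only the classifiers along a single root-to-leaf path are ever evaluated, the expected cost per decision scales with the average path depth rather than with the ensemble size, which matches the abstract's observation that on average only a small subset of the RBF networks contributes to a decision.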




Author information

Correspondence to Uwe Knauer.


Cite this article

Knauer, U., Backhaus, A. & Seiffert, U. Fusion trees for fast and accurate classification of hyperspectral data with ensembles of \(\gamma\)-divergence-based RBF networks. Neural Comput & Applic 26, 253–262 (2015). https://doi.org/10.1007/s00521-014-1634-9

