Uncertainty evaluation and model selection of extreme learning machine based on Riemannian metric

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Considering the uncertainty of hidden neurons, choosing significant hidden nodes, a task known as model selection, plays an important role in applications of extreme learning machines (ELMs). How to define and measure this uncertainty is a key issue in model selection for ELM. From the information geometry point of view, this paper presents a new model selection method for ELM in regression problems based on the Riemannian metric. First, the paper proves theoretically that the uncertainty can be characterized by a form of Riemannian metric. On this basis, a new uncertainty evaluation of ELM is proposed by averaging the Riemannian metric over all hidden neurons. Finally, hidden nodes are added to the network one by one, and at each step a multi-objective optimization algorithm is used to select the optimal input weights by minimizing this uncertainty evaluation and the norm of the output weights simultaneously, in order to obtain better generalization performance. Experiments on five UCI regression data sets and a cylindrical shell vibration data set demonstrate that the proposed method generally achieves lower generalization error than the original ELM, evolutionary ELM, ELM with model selection, and the multi-dimensional support vector machine. Moreover, the proposed algorithm generally needs fewer hidden neurons and less computational time than the traditional approaches, which is very favorable in engineering applications.

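As a rough illustration of the incremental selection procedure described in the abstract, the sketch below grows an ELM for regression one hidden node at a time. It is only a minimal sketch under several stated assumptions: a sigmoid activation, a Jacobian-based term averaged over the training data standing in for the paper's Riemannian-metric uncertainty, and a random multi-start search with a weighted-sum objective standing in for the MOCLPSO multi-objective optimizer used by the authors; the function names train_growing_elm and predict are hypothetical.

```python
# Illustrative sketch only: the paper's exact Riemannian-metric uncertainty and its
# MOCLPSO-based multi-objective search are not reproduced here.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_growing_elm(X, y, max_nodes=20, n_candidates=50, trade_off=0.5, seed=0):
    """Grow an ELM one sigmoid hidden node at a time (hypothetical helper)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W, b = np.empty((0, d)), np.empty(0)   # input weights / biases selected so far
    H = np.empty((n, 0))                   # hidden-layer output matrix
    beta = np.empty(0)                     # output weights
    for _ in range(max_nodes):
        best = None
        for _ in range(n_candidates):      # random multi-start stands in for MOCLPSO
            w_c = rng.uniform(-1.0, 1.0, d)
            b_c = rng.uniform(-1.0, 1.0)
            h_c = sigmoid(X @ w_c + b_c)
            # Metric-style term: mean squared norm of the neuron's gradient w.r.t. x,
            # a stand-in for the paper's Riemannian-metric uncertainty of a neuron.
            grad_sq = ((h_c * (1.0 - h_c)) ** 2)[:, None] * (w_c ** 2)
            uncertainty = grad_sq.sum(axis=1).mean()
            H_c = np.hstack([H, h_c[:, None]])
            beta_c = np.linalg.pinv(H_c) @ y   # least-squares output weights
            # Weighted sum scalarizes the two objectives (uncertainty, ||beta||).
            score = trade_off * uncertainty + (1.0 - trade_off) * np.linalg.norm(beta_c)
            if best is None or score < best[0]:
                best = (score, w_c, b_c, h_c, beta_c)
        _, w_c, b_c, h_c, beta = best
        W = np.vstack([W, w_c])
        b = np.append(b, b_c)
        H = np.hstack([H, h_c[:, None]])
    return W, b, beta

def predict(X, W, b, beta):
    """Evaluate the grown ELM on new inputs."""
    return sigmoid(X @ W.T + b) @ beta
```

A typical call would be W, b, beta = train_growing_elm(X_train, y_train) followed by y_pred = predict(X_test, W, b, beta); the trade_off parameter controls the balance between the uncertainty term and the output-weight norm, the two objectives that the paper's multi-objective search treats as a Pareto front rather than a fixed weighted sum.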

References

  1. Huang GB, Zhu QY, Siew CK (2004) Extreme learning machine: a new learning scheme of feedforward neural networks. In: Proceedings of 2004 international joint conference on neural networks (IJCNN’2004), Budapest, Hungary, pp 985–990

  2. Huang GB, Zhou H, Ding X, Zhang R (2012) Extreme learning machine for regression and multiclass classification. IEEE Trans Syst Man Cybern B 42:513–529

  3. Huang GB, Zhu QY, Siew CK (2006) Extreme learning machine: theory and applications. Neurocomputing 70:489–501

  4. Chen FL, Ou TY (2011) Sales forecasting system based on gray extreme learning machine with Taguchi method in retail industry. Expert Syst Appl 38:1336–1345

  5. Minhas R, Baradarani A, Seifzadeh S, Wu QMJ (2010) Human action recognition using extreme learning machine based on visual vocabularies. Neurocomputing 73:1906–1917

  6. Mohammed A, Wu QMJ, Sid-Ahmed M (2010) Application of wave atoms decomposition and extreme learning machine for fingerprint classification. Lect Notes Comput Sci 6112:246–256

  7. Huang GB, Wang D, Lan Y (2011) Extreme learning machines: a survey. Int J Mach Learn Cybern 2:107–122

  8. Feng G, Huang GB, Lin Q, Gay R (2009) Error minimized extreme learning machine with growth of hidden nodes and incremental learning. IEEE Trans Neural Netw 20:1352–1357

  9. Lan Y, Soh YC, Huang GB (2010) Random search enhancement of error minimized extreme learning machine. In: Proceedings of European symposium on artificial neural networks (ESANN 2010), Bruges, Belgium, pp 327–332

  10. Li K, Huang GB, Ge SS (2010) Fast construction of single hidden layer feedforward networks. In: Rozenberg G, Bäck T, Kok JN (eds) Handbook of natural computing. Springer, Berlin

  11. Zhu QY, Qin AK, Suganthan PN, Huang GB (2005) Evolutionary extreme learning machine. Pattern Recogn 38:1759–1763

  12. Lan Y, Soh YC, Huang GB (2010) Two-stage extreme learning machine for regression. Neurocomputing 73:3028–3038

  13. Mao W, Tian M, Cao X, Xu J (2013) Model selection of extreme learning machine based on multi-objective optimization. Neural Comput Appl 22(3):521–529

  14. Amari S, Wu S (1999) Improving support vector machine classifiers by modifying Kernel functions. Neural Netw 12:783–789

  15. Huang GB, Babri HA (1998) Upper bounds on the number of hidden neurons in feedforward networks with arbitrary bounded nonlinear activation functions. IEEE Trans Neural Netw 9:224–229

  16. Burges CJC (1998) Geometry and invariance in Kernel based methods. In: Schölkopf B, Burges CJC, Smola AJ (eds) Advances in Kernel methods: support vector learning. MIT Press, Cambridge, pp 89–116

  17. Huang GB, Ding X, Zhou H (2010) Optimization method based extreme learning machine for classification. Neurocomputing 74:155–163

  18. Huang VL, Suganthan PN, Liang JJ (2006) Comprehensive learning particle swarm optimizer for solving multiobjective optimization problems. Int J Intell Syst 21:209–226

  19. Newman DJ, Hettich S, Blake CL, Merz CJ (1998) UCI Repository of machine learning databases. Department of Information and Computer Science, University of California, Irvine, CA

  20. Mao W, Yan G, Dong L (2012) A novel machine learning based method of combined dynamic environment prediction. J Sound Vib (under review)

  21. Sánchez-Fernández M, De-Prado-Cumplido M, Arenas-García J, Pérez-Cruz F (2004) SVM multiregression for nonlinear channel estimation in multiple-input multiple-output systems. IEEE Trans Signal Process 52:2298–2307

  22. Kohavi R (1995) A study of cross-validation and bootstrap for accuracy estimation and model selection. In: Proceedings of the 14th international joint conference on artificial intelligence (IJCAI), pp 1137–1143

Acknowledgments

We thank P. N. Suganthan, one of the authors of [18], for providing the implementation of MOCLPSO. This work was supported by the National Natural Science Foundation of China (Nos. U1204609 and 60873104) and the Key Scientific and Technological Project of Henan Province, China (No. 122102210086).

Author information

Corresponding author

Correspondence to Wentao Mao.

About this article

Cite this article

Mao, W., Zheng, Y., Mu, X. et al. Uncertainty evaluation and model selection of extreme learning machine based on Riemannian metric. Neural Comput & Applic 24, 1613–1625 (2014). https://doi.org/10.1007/s00521-013-1392-0
