Abstract
Liao et al. (Neurocomputing 128:81–87, 2014) proposed a meta-learning approach to the extreme learning machine (Meta-ELM), which achieves good generalization performance by training multiple ELMs. However, Meta-ELM remains prone to overfitting when it minimizes the training error. In this paper, we propose an improved meta-learning model of ELM (improved Meta-ELM) to address this problem. The improved Meta-ELM architecture consists of several base learners, each an error feedback incremental extreme learning machine (EFI-ELM), together with a top ELM. Training proceeds in two stages. First, each EFI-ELM base learner is trained on a subset of the training data. Then, the top ELM is trained with the base ELMs serving as its hidden nodes. Simulation results on artificial and benchmark datasets show that the proposed improved Meta-ELM model is more feasible and effective than Meta-ELM.
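The two-stage structure described above can be sketched in code. The following is a minimal illustration, not the authors' exact algorithm: each base learner here is a plain ELM trained on a random subset (the paper's EFI-ELM refinement is omitted), and the top ELM treats the base learners' outputs as hidden-node activations and solves one final least-squares problem. All names (`BaseELM`, `meta_elm_fit`, `n_base`, `n_hidden`) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class BaseELM:
    """Single-hidden-layer ELM: random input weights, pseudoinverse solve."""
    def __init__(self, n_in, n_hidden):
        self.W = rng.standard_normal((n_in, n_hidden))   # random input weights
        self.b = rng.standard_normal(n_hidden)           # random hidden biases
        self.beta = None                                 # learned output weights

    def hidden(self, X):
        return np.tanh(X @ self.W + self.b)              # hidden activations

    def fit(self, X, y):
        # Least-squares output weights via Moore-Penrose pseudoinverse
        self.beta = np.linalg.pinv(self.hidden(X)) @ y
        return self

    def predict(self, X):
        return self.hidden(X) @ self.beta

def meta_elm_fit(X, y, n_base=5, n_hidden=20):
    """Stage 1: train base ELMs on random data subsets.
       Stage 2: top ELM uses base-ELM outputs as its hidden nodes."""
    n = X.shape[0]
    bases = []
    for _ in range(n_base):
        idx = rng.choice(n, size=n // 2, replace=False)  # random subset
        bases.append(BaseELM(X.shape[1], n_hidden).fit(X[idx], y[idx]))
    H = np.column_stack([b.predict(X) for b in bases])   # top hidden matrix
    beta_top = np.linalg.pinv(H) @ y                     # top output weights
    return bases, beta_top

def meta_elm_predict(bases, beta_top, X):
    H = np.column_stack([b.predict(X) for b in bases])
    return H @ beta_top

# Toy regression problem: fit y = sin(x)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
bases, beta_top = meta_elm_fit(X, y)
pred = meta_elm_predict(bases, beta_top, X)
print("train MSE:", float(np.mean((pred - y) ** 2)))
```

The design point the sketch captures is that the top ELM never sees the raw features: its "hidden layer" is the ensemble of base learners, so the final fit is again a single linear solve.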
References
Liu HP, Liu YH, Sun FC (2015) Robust exemplar extraction using structured sparse coding. IEEE Trans Neural Netw Learn Syst 26(8):1816–1821
Liu HP, Qin J, Sun FC, Guo D. Extreme kernel sparse learning for tactile object recognition. IEEE Trans Cybern (in press)
Yang YM, Jonathan Wu QM (2009) Multilayer extreme learning machine with subnetwork nodes for representation learning. IEEE Trans Cybern 20(8):1352–1357
Cao J, Zhao Y, Lai X, Ong MEH, Yin C, Koh Z, Liu N (2015) Landmark recognition with sparse representation classification and extreme learning machine. J Frankl Inst 352(10):4528–4545
Huang GB, Bai Z, Kasun LLC, Vong CM (2015) Local receptive fields based extreme learning machine. IEEE Comput Intell Mag 10(2):18–29
Guo D, Zhang Y, Xiao Z, Mao M, Liu J (2015) Common nature of learning between Bp-Type and Hopfield-type neural networks. Neurocomputing 167:578–586
Qi XX, Yuan ZH, Han XW (2015) Diagnosis of misalignment faults by tacholess order tracking analysis and RBF networks. Neurocomputing 169:439–448
Ekici S, Yildirim S, Poyraz M (2009) A transmission line fault locator based on Elman recurrent networks. Appl Soft Comput 9(1):341–347
Huang GB, Chen L, Siew CK (2006) Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans Neural Netw 17(4):879–892
Huang GB, Chen L (2007) Convex incremental extreme learning machine. Neurocomputing 70:3056–3062
Huang GB, Chen L (2008) Enhanced random search based incremental extreme learning machine. Neurocomputing 71(16–18):3460–3468
Miche Y, Sorjamaa A, Bas P, Simula O, Jutten C, Lendasse A (2010) OP-ELM: optimally pruned extreme learning machine. IEEE Trans Neural Netw 21(1):158–162
Feng G, Huang GB, Lin Q, Gay R (2009) Error minimized extreme learning machine with growth of hidden nodes and incremental learning. IEEE Trans Neural Netw 20(8):1352–1357
Lan Y, Soh YC, Huang GB (2010) Two-stage extreme learning machine for regression. Neurocomputing 73(1):3028–3038
Yang YM, Wang YN, Yuan XF (2012) Bidirectional extreme learning machine for regression problem and its learning effectiveness. IEEE Trans Neural Netw Learn Syst 23:1498–1505
Sun ZL, Choi TM, Au KF, Yu Y (2008) Sales forecasting using extreme learning machine with applications in fashion retailing. Decis Support Syst 46(1):411–419
Lan Y, Soh YC, Huang GB (2009) Ensemble of online sequential extreme learning machine. Neurocomputing 72(13–15):3391–3395
Liao SZ, Feng C (2014) Meta-ELM: ELM with ELM hidden nodes. Neurocomputing 128:81–87
Horn RA, Johnson CR (2012) Matrix analysis. Cambridge University Press
Ethics declarations
Conflict of interest
The authors (Weidong Zou, Fenxi Yao, Baihai Zhang, Zixiao Guan) of the paper "Improved Meta-ELM with error feedback incremental ELM as hidden nodes" (NCAA-D-16-00405-R2) declare that they have no conflict of interest.
Cite this article
Zou, W., Yao, F., Zhang, B. et al. Improved Meta-ELM with error feedback incremental ELM as hidden nodes. Neural Comput & Applic 30, 3363–3370 (2018). https://doi.org/10.1007/s00521-017-2922-y