Abstract
Complex-valued neural networks extend ordinary neural networks from the real domain to the complex domain. The fully complex extreme learning machine (CELM) is an efficient training algorithm that converges faster than the common complex backpropagation (CBP) networks; however, it needs more hidden neurons to reach competitive performance. Recently, an efficient learning algorithm called the upper-layer-solution-aware (USA) algorithm was proposed for single-hidden-layer feed-forward neural networks. Motivated by USA, this paper proposes an efficient algorithm (GGICNN) that trains the input weights of split complex-valued neural networks. Detailed experiments comparing the proposed algorithm with CELM and CBP show that it achieves better generalization ability with a more compact architecture.
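To make the baseline concrete, the following is a minimal sketch of a CELM-style network: the complex input weights and biases are drawn at random and kept fixed, and only the output weights are solved in closed form via the Moore-Penrose pseudoinverse. The function names, the `tanh` activation, and the network sizes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def celm_fit(X, T, n_hidden=32):
    """Fit a fully complex ELM.

    X: (n_samples, n_in) complex inputs; T: (n_samples, n_out) complex targets.
    Returns the fixed random input weights/biases and the solved output weights.
    """
    n_in = X.shape[1]
    # Random complex input weights and biases (never trained afterwards).
    W = rng.standard_normal((n_in, n_hidden)) + 1j * rng.standard_normal((n_in, n_hidden))
    b = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)            # fully complex hidden-layer output
    beta = np.linalg.pinv(H) @ T      # output weights via the pseudoinverse
    return W, b, beta

def celm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: learn the simple complex mapping y = z**2 on a small domain.
X = (rng.standard_normal((200, 1)) + 1j * rng.standard_normal((200, 1))) * 0.3
T = X ** 2
W, b, beta = celm_fit(X, T)
err = np.mean(np.abs(celm_predict(X, W, b, beta) - T))
```

The closed-form solve is what makes CELM fast relative to CBP's iterative gradient updates; the paper's point is that because the input weights stay random, more hidden neurons are needed than when the input weights are also trained, as in USA-style algorithms.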
Acknowledgements
This work was supported in part by the National Natural Science Foundation of China (No. 61305075), the China Postdoctoral Science Foundation (No. 2012M520624), the Natural Science Foundation of Shandong Province (No. ZR2013FQ004, ZR2013DM015, ZR2015AL014), the Specialized Research Fund for the Doctoral Program of Higher Education of China (No. 20130133120014) and the Fundamental Research Funds for the Central Universities (Nos. 14CX05042A, 15CX05053A, 15CX02079A, 15CX08011A, 15CX02064A).
Copyright information
© 2017 Springer International Publishing AG
About this paper
Cite this paper
Liu, Q., Sang, Z., Chen, H., Wang, J., Zhang, H. (2017). An Efficient Algorithm for Complex-Valued Neural Networks Through Training Input Weights. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, ES. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science(), vol 10637. Springer, Cham. https://doi.org/10.1007/978-3-319-70093-9_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-70092-2
Online ISBN: 978-3-319-70093-9
eBook Packages: Computer Science (R0)