Abstract
Artificial neural networks (ANNs) trained with the backpropagation (BP) algorithm have recently been applied in many areas and applications. The BP algorithm is known as an excellent classifier for nonlinear numerical input and output data. However, the popularity of BP comes with some weaknesses: slow learning and a tendency to get stuck in local minima. Improving the performance of the BP algorithm is a vigorous area of research, and the relevant literature is reviewed here. Furthermore, the performance of the BP algorithm is also highly influenced by the order of learning and the parameters that are chosen. This paper presents an improvement of the BP algorithm by adjusting the two-term parameters on the performance of third-order neural network methods. The effectiveness of the proposed method is demonstrated through simulations on medical classification problems. The results show that the proposed implementation significantly improves the learning speed of the standard BP algorithm.
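The paper's exact third-order formulation is not reproduced on this page, but the two-term BP update it builds on can be sketched. The following is a minimal illustrative sketch, assuming the "two-term parameters" are the standard learning rate and momentum coefficient of classical BP; the XOR task, network size, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(epochs=10000, lr=0.1, momentum=0.8, seed=0):
    """Two-term BP: each weight change combines a learning-rate term
    (gradient step) and a momentum term (fraction of the previous change)."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    # One hidden layer of 4 sigmoid units, one sigmoid output.
    W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
    # Velocity buffers hold the previous weight changes (momentum term).
    vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
    vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

    def mse():
        h = sigmoid(X @ W1 + b1)
        return float(np.mean((sigmoid(h @ W2 + b2) - y) ** 2))

    initial = mse()
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass (mean squared error, sigmoid derivatives).
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # Two-term update: new change = momentum * old change - lr * gradient.
        vW2 = momentum * vW2 - lr * (h.T @ d_out)
        vb2 = momentum * vb2 - lr * d_out.sum(axis=0)
        vW1 = momentum * vW1 - lr * (X.T @ d_h)
        vb1 = momentum * vb1 - lr * d_h.sum(axis=0)
        W2 += vW2; b2 += vb2; W1 += vW1; b1 += vb1
    return initial, mse()
```

The momentum term smooths successive gradient steps, which is the usual motivation for two-term BP: it accelerates learning along consistent descent directions and helps the network move past shallow local minima.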
Acknowledgements
The authors would like to thank Universiti Tun Hussein Onn Malaysia (UTHM) and the Ministry of Higher Education (MOHE) Malaysia for financially supporting this research under IGSP grant vote no. U420 and the Trans-disciplinary Research Grant Scheme (TRGS) vote no. T003.
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Nawi, N.M., Roslan, K.M., Kamal, S.B.M., Hamid, N.A. (2019). The Effect of Two-Term Parameters on the Performance of Third-Order Neural Network on Medical Classification Problems. In: Piuri, V., Balas, V., Borah, S., Syed Ahmad, S. (eds) Intelligent and Interactive Computing. Lecture Notes in Networks and Systems, vol 67. Springer, Singapore. https://doi.org/10.1007/978-981-13-6031-2_20
Publisher Name: Springer, Singapore
Print ISBN: 978-981-13-6030-5
Online ISBN: 978-981-13-6031-2
eBook Packages: Intelligent Technologies and Robotics (R0)