The Effect of Two-Term Parameters on the Performance of Third-Order Neural Network on Medical Classification Problems

  • Conference paper
  • First Online:
  • Part of the book: Intelligent and Interactive Computing

Abstract

Artificial neural networks (ANN) trained with the backpropagation (BP) algorithm have recently been applied in many areas and applications. The BP algorithm is known as an excellent classifier for nonlinear numerical input and output data. However, the popularity of BP comes with some weaknesses, namely slow learning and a tendency to get stuck in local minima. A vigorous area of research on improving the performance of the BP algorithm is reviewed, particularly from the literature. Furthermore, the performance of the BP algorithm is also highly influenced by the order of the learning method and the parameters that are chosen. This paper presents an improvement of the BP algorithm that adjusts the two-term parameters of third-order neural network methods. The effectiveness of the proposed method is demonstrated through simulations on medical classification problems. The results show that the proposed implementation significantly improves the learning speed of the standard BP algorithm.
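
In the BP literature this paper builds on, the "two-term parameters" are the learning rate and the momentum coefficient. As a rough illustration only (the authors' third-order method is not detailed on this page, and all names and values below are assumptions, not their implementation), the following Python sketch shows the classical two-term weight update on a toy single-unit network:

    import numpy as np

    # Minimal sketch (an assumption, not the authors' code): the classical
    # two-term backpropagation update. The two terms are the learning rate
    # (eta) and the momentum coefficient (alpha) that the abstract refers to.

    def two_term_update(w, grad, prev_delta, eta=0.1, alpha=0.9):
        """One BP step: delta_w = -eta * grad + alpha * prev_delta."""
        delta = -eta * grad + alpha * prev_delta
        return w + delta, delta

    # Toy usage on a single sigmoid unit with synthetic data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))            # 20 samples, 3 features
    y = (X.sum(axis=1) > 0).astype(float)   # synthetic binary labels

    w = rng.normal(scale=0.1, size=3)       # initial weights
    prev_delta = np.zeros_like(w)
    for _ in range(100):
        out = 1.0 / (1.0 + np.exp(-X @ w))                     # sigmoid output
        grad = X.T @ ((out - y) * out * (1.0 - out)) / len(y)  # MSE gradient
        w, prev_delta = two_term_update(w, grad, prev_delta)

Higher-order methods replace the plain gradient step with updates that use curvature information but typically retain these two tunable terms; their interaction with a third-order scheme is what the paper studies.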



Acknowledgements

The authors would like to thank Universiti Tun Hussein Onn Malaysia (UTHM) and the Ministry of Higher Education (MOHE) Malaysia for financially supporting this research under IGSP grant vote no. U420 and under the Trans-disciplinary Research Grant Scheme (TRGS) vote no. T003.

Author information

Corresponding author

Correspondence to Nazri Mohd Nawi.

Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Nawi, N.M., Roslan, K.M., Kamal, S.B.M., Hamid, N.A. (2019). The Effect of Two-Term Parameters on the Performance of Third-Order Neural Network on Medical Classification Problems. In: Piuri, V., Balas, V., Borah, S., Syed Ahmad, S. (eds) Intelligent and Interactive Computing. Lecture Notes in Networks and Systems, vol 67. Springer, Singapore. https://doi.org/10.1007/978-981-13-6031-2_20
