A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions

  • Original Research
  • Published in: Journal of Applied Mathematics and Computing

Abstract

Conjugate gradient methods are widely used for unconstrained optimization because of their simple structure and small storage requirements. For non-convex functions, the global convergence analysis of these methods is also crucial, but almost all existing results require the gradient Lipschitz continuity condition. Based on the work of Hager and Zhang (Hager and Zhang in SIAM J. Optim. 16:170–192, 2005), Algorithm 1 and Algorithm 2 are proposed and analyzed for unconstrained optimization problems. The proposed algorithms possess the sufficient descent property and the trust region feature independently of the line search technique. The global convergence of Algorithm 1 is obtained without the gradient Lipschitz continuity condition under the weak Wolfe-Powell inexact line search. Algorithm 2 further improves Algorithm 1 so that its global convergence holds independently of the line search technique. Numerical experiments are reported for the Muskingum model and image restoration problems.
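As a concrete illustration of the ingredients named above, the sketch below implements a basic Hager-Zhang-type conjugate gradient iteration with a weak Wolfe-Powell line search on a small convex quadratic. It is not the paper's Algorithm 1 or 2: the bisection line search, the parameter values, and the test problem are illustrative assumptions.

```python
import numpy as np

def wwp_line_search(f, grad, x, d, delta=0.1, sigma=0.9, max_iter=50):
    # Bisection search for a step length satisfying the weak Wolfe-Powell
    # conditions: f(x+a*d) <= f(x) + delta*a*g'd  and  grad(x+a*d)'d >= sigma*g'd.
    lo, hi, alpha = 0.0, np.inf, 1.0
    f0, g0d = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + delta * alpha * g0d:
            hi = alpha                          # step too long: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < sigma * g0d:
            lo = alpha                          # step too short: grow
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            break
    return alpha

def hz_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    # Hager-Zhang direction: d_{k+1} = -g_{k+1} + beta_k * d_k, with
    # beta_k = (y_k - 2*d_k*||y_k||^2 / (d_k'y_k))' g_{k+1} / (d_k'y_k).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wwp_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        beta = 0.0 if abs(dy) < 1e-12 else ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Convex quadratic test problem: f(x) = 0.5 x'Ax - b'x, whose minimizer solves Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
quad_f = lambda x: 0.5 * x @ A @ x - b @ x
quad_g = lambda x: A @ x - b
x_star = hz_cg(quad_f, quad_g, np.array([0.0, 0.0]))
```

On a convex quadratic the gradient is globally Lipschitz, so this sketch only exercises the mechanics; the paper's contribution is precisely the convergence analysis when that Lipschitz assumption is dropped.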


References

  1. Andrei, N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38, 401–416 (2007)

  2. Birgin, E., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43, 117–128 (2001)

  3. Dai, Y.: Analysis of conjugate gradient methods. Ph.D. Thesis, Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences (1997)

  4. Dai, Y.: Convergence properties of the BFGS algorithm. SIAM J. Optim. 13, 693–701 (2003)

  5. Dai, Y., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (2000)

  6. Dai, Y., Yuan, Y.: Nonlinear Conjugate Gradient Methods. Shanghai Scientific and Technical Publishers (1998)

  7. Fletcher, R.: Practical Methods of Optimization, 2nd edn. John Wiley and Sons, New York (1987)

  8. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)

  9. Fan, J., Yuan, Y.: A new trust region algorithm with trust region radius converging to zero. In: Li, D. (ed.) Proceedings of the 5th International Conference on Optimization: Techniques and Applications, Hong Kong, December 2001, pp. 786–794 (2001)

  10. Geem, Z.W.: Parameter estimation for the nonlinear Muskingum model using the BFGS technique. J. Hydrol. Eng. 132, 21–43 (2006)

  11. Grippo, L., Lucidi, S.: A globally convergent version of the Polak-Ribière-Polyak conjugate gradient method. Math. Program. 78, 375–391 (1997)

  12. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2, 21–42 (1992)

  13. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)

  14. Hager, W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)

  15. Hager, W., Zhang, H.: Algorithm 851: A conjugate gradient method with guaranteed descent. ACM Trans. Math. Soft. 32, 113–137 (2006)

  16. Levenberg, K.: A method for the solution of certain nonlinear problem in least squares. Quart. Appl. Math. 2, 164–168 (1944)

  17. Li, Q., Li, D.: A class of derivative-free methods for large-scale nonlinear monotone equations. IMA J. Numer. Anal. 31, 1625–1635 (2011)

  18. Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, part 1: theory. J. Optim. Theory Appl. 69, 129–137 (1991)

  19. Li, X., Wang, S., Jin, Z., Pham, H.: A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models. Math. Probl. Eng. 2018, 1–11 (2018)

  20. Martinet, B.: Régularisation d'inéquations variationnelles par approximations successives. Rev. Fr. Inform. Rech. Oper. 4, 154–158 (1970)

  21. Ouyang, A., Liu, L., Sheng, Z., et al.: A class of parameter estimation methods for nonlinear Muskingum model using hybrid invasive weed optimization algorithm. Math. Probl. Eng. 2015, 1–15 (2015)

  22. Ouyang, A., Tang, Z., Li, K., et al.: Estimating parameters of Muskingum model using an adaptive hybrid PSO algorithm. Int. J. Pattern. Recogn. 28, 1–29 (2014)

  23. Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)

  24. Powell, M.J.D.: Convergence properties of a class of minimization algorithms. In: Mangasarian, O.L., Meyer, R.R., Robinson, S.M. (eds.) Nonlinear Programming 2, pp. 1–27. Academic Press, New York (1974)

  25. Powell, M.J.D.: Convergence properties of algorithms for nonlinear optimization. SIAM Rev. 28, 487–500 (1986)

  26. Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. Lecture Notes in Mathematics, vol. 1066. Springer-Verlag, Berlin (1984)

  27. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Fran. Inf. Rech. Opérat. 3, 35–43 (1969)

  28. Sheng, Z., Ouyang, A., Liu, L., et al.: A novel parameter estimation method for Muskingum model using new Newton-type trust region algorithm. Math. Probl. Eng. 2014, Art. ID 634852, 1–7 (2014)

  29. Sheng, Z., Yuan, G.: An effective adaptive trust region algorithm for nonsmooth minimization. Comput. Optim. Appl. 71, 251–271 (2018)

  30. Sheng, Z., Yuan, G., Cui, Z.: A new adaptive trust region algorithm for optimization problems. Acta Math. Scientia 38B(2), 479–496 (2018)

  31. Sheng, Z., Yuan, G., Cui, Z., et al.: An adaptive trust region algorithm for large-residual nonsmooth least squares problems. J. Ind. Manage. Optim. 14, 707–718 (2018)

  32. Wei, Z., Yao, S., Liu, L.: The convergence properties of some new conjugate gradient methods. Appl. Math. Comput. 183, 1341–1350 (2006)

  33. Yuan, G.: Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems. Optim. Let. 3, 11–21 (2009)

  34. Yuan, Y.: Analysis on the conjugate gradient method. Optim. Meth. Soft. 2, 19–29 (1993)

  35. Yuan, G., Lu, X.: A modified PRP conjugate gradient method. Ann. Oper. Res. 166, 73–90 (2009)

  36. Yuan, G., Lu, S., Wei, Z.: A new trust-region method with line search for solving symmetric nonlinear equations. Int. J. Comput. Math. 88, 2109–2123 (2011)

  37. Yuan, G., Lu, X., Wei, Z.: A conjugate gradient method with descent direction for unconstrained optimization. J. Comput. Appl. Math. 233, 519–530 (2009)

  38. Yuan, G., Lu, X., Wei, Z.: BFGS trust-region method for symmetric nonlinear equations. J. Comput. Appl. Math. 230, 44–58 (2009)

  39. Yuan, G., Meng, Z., Li, Y.: A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations. J. Optim. Theory Appl. 168, 129–152 (2016)

  40. Yuan, G., Wei, Z., Li, G.: A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs. J. Comput. Appl. Math. 255, 86–96 (2014)

  41. Yuan, G., Wei, Z., Lu, X.: Global convergence of the BFGS method and the PRP method for general functions under a modified weak Wolfe-Powell line search. Appl. Math. Model. 47, 811–825 (2017)

  42. Yuan, G., Wang, X., Sheng, Z.: The projection technique for two open problems of unconstrained optimization problems. J. Optim. Theory Appl. 186, 590–619 (2020)

  43. Yuan, G., Wei, Z., Yang, Y.: The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions. J. Comput. Appl. Math. 362, 262–275 (2019)

  44. Yuan, G., Zhang, M.: A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations. J. Comput. Appl. Math. 286, 186–195 (2015)

  45. Zoutendijk, G.: Nonlinear programming computational methods. In: Abadie, J. (ed.) Integer and Nonlinear programming, pp. 37–86. Northholland, Amsterdam (1970)

  46. Zhang, L., Zhou, W., Li, D.: A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26, 629–649 (2006)

  47. Zhang, L., Zhou, W., Li, D.: Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search. Numer. Math. 104, 561–572 (2006)


Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant No. 11661009), the High Level Innovation Teams and Excellent Scholars Program in Guangxi Institutions of Higher Education (Grant No. [2019]52), the Guangxi Natural Science Key Fund (No. 2017GXNSFDA198046), the Special Funds for Local Science and Technology Development Guided by the Central Government (No. ZY20198003), the Innovation Project of Guangxi Graduate Education (YCBZ2021027), and the Special Foundation for Guangxi Ba Gui Scholars.

Author information

Corresponding author

Correspondence to Ailun Jian.

About this article

Cite this article

Yuan, G., Jian, A., Zhang, M. et al. A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions. J. Appl. Math. Comput. 68, 4691–4712 (2022). https://doi.org/10.1007/s12190-022-01724-z
