
A descent hybrid conjugate gradient method based on the memoryless BFGS update

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

In this work, we present a new hybrid conjugate gradient method based on the convex hybridization of the conjugate gradient update parameters of DY and HS+, adapting a quasi-Newton philosophy. The hybridization parameter is computed by minimizing the distance between the hybrid conjugate gradient direction and the self-scaling memoryless BFGS direction. Furthermore, a significant property of the proposed method is that it ensures sufficient descent independently of the accuracy of the line search. Global convergence is established provided that the line search satisfies the Wolfe conditions. Numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection indicate that the proposed method is preferable and, in general, superior to classic conjugate gradient methods in terms of efficiency and robustness.
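For orientation, the building blocks mentioned in the abstract can be sketched in standard notation (a general outline, not the formulas from the paper): writing $g_{k+1}$ for the gradient at $x_{k+1}$, $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$, a convex hybridization of the DY and HS+ update parameters generates search directions of the form

\[
d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{hyb}} d_k,
\qquad
\beta_k^{\mathrm{hyb}} = (1-\theta_k)\,\beta_k^{\mathrm{HS+}} + \theta_k\,\beta_k^{\mathrm{DY}},
\qquad \theta_k \in [0,1],
\]
\[
\beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^2}{d_k^{T} y_k},
\qquad
\beta_k^{\mathrm{HS+}} = \max\left\{ \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k},\ 0 \right\},
\]

while the (unscaled) memoryless BFGS direction, obtained by applying a single BFGS update to the identity matrix, is

\[
d_{k+1}^{\mathrm{MBFGS}}
= -g_{k+1}
+ \frac{g_{k+1}^{T} s_k}{s_k^{T} y_k}\, y_k
- \left[ \left(1 + \frac{\|y_k\|^2}{s_k^{T} y_k}\right)\frac{g_{k+1}^{T} s_k}{s_k^{T} y_k}
       - \frac{g_{k+1}^{T} y_k}{s_k^{T} y_k} \right] s_k .
\]

The paper's contribution is the specific choice of $\theta_k$ that minimizes the distance between the hybrid direction and a self-scaled variant of this memoryless BFGS direction; that formula, together with the sufficient-descent and convergence analysis, is given in the full text and is not reproduced here.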



Author information


Corresponding author

Correspondence to Ioannis E. Livieris.

About this article


Cite this article

Livieris, I.E., Tampakas, V. & Pintelas, P. A descent hybrid conjugate gradient method based on the memoryless BFGS update. Numer Algor 79, 1169–1185 (2018). https://doi.org/10.1007/s11075-018-0479-1

