A decent three-term conjugate gradient method with global convergence properties for large-scale unconstrained optimization problems

Abstract: The conjugate gradient (CG) method is a method for solving unconstrained optimization problems, with applications in medical science, industry, neural networks, and many other fields. In this paper, a new three-term CG method is proposed. The new CG formula is constructed from the DL and WYL CG formulas so that it is non-negative and inherits the properties of the HS formula. The new modification satisfies the global convergence properties and the sufficient descent property. The numerical results show that the new modification is more efficient than the DL and WYL methods.


Introduction
The conjugate gradient (CG) method is a method for solving large-scale unconstrained optimization problems. We consider the following problem:

$$\min_{x \in \mathbb{R}^n} f(x), \qquad (1)$$

where $f : \mathbb{R}^n \to \mathbb{R}$ is a continuous and differentiable function whose gradient $g(x) = \nabla f(x)$ is available. The CG method generates a sequence $\{x_k\}$ as follows:

$$x_{k+1} = x_k + \alpha_k d_k, \qquad (2)$$

where $x_k$ is the current point (iterate) and $\alpha_k > 0$ is a step length. The search direction of the CG method is defined as follows:

$$d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0, \qquad (3)$$

where $g_k = g(x_k)$ and $\beta_k$ is known as the CG formula. To compute the step length $\alpha_k$, we normally use the strong Wolfe–Powell (SWP) line search [1,2], defined as follows:

$$f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k \qquad (4)$$

and

$$|g(x_k + \alpha_k d_k)^T d_k| \le \sigma |g_k^T d_k|, \qquad (5)$$

where $0 < \delta < \sigma < 1$. The SWP line search is a strong version of the weak Wolfe–Powell (WWP) line search; the latter is given by (4) and

$$g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k. \qquad (6)$$

The most famous classical CG formulas are Hestenes–Stiefel (HS) [3], Polak–Ribière–Polyak (PRP) [4], Liu–Storey (LS) [5], Fletcher–Reeves (FR) [6], conjugate descent (CD) of Fletcher [7], and Dai–Yuan (DY) [8], as follows:

$$\beta_k^{HS} = \frac{g_{k+1}^T y_k}{d_k^T y_k}, \qquad \beta_k^{PRP} = \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \qquad \beta_k^{LS} = -\frac{g_{k+1}^T y_k}{d_k^T g_k},$$

$$\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad \beta_k^{CD} = -\frac{\|g_{k+1}\|^2}{d_k^T g_k}, \qquad \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k},$$

where $y_k = g_{k+1} - g_k$ and $\|\cdot\|$ denotes the Euclidean norm.

Polak and Ribière [4] proved that the PRP method with an exact line search is globally convergent when the objective function is strongly convex. On the other hand, Powell [9] proposed an example showing that there exists a function for which the PRP method does not converge globally even if the exact line search is employed. Powell suggested using a non-negative value of the PRP parameter. Gilbert and Nocedal [10] proved that the method (2) and (3) is globally convergent with

$$\beta_k^{PRP+} = \max\{0, \beta_k^{PRP}\}. \qquad (7)$$

Dai and Liao proposed the following CG formula, based on a modified conjugacy condition:

$$\beta_k^{DL} = \frac{g_{k+1}^T y_k}{d_k^T y_k} - t \frac{g_{k+1}^T s_k}{d_k^T y_k}, \qquad t > 0, \qquad (8)$$

where $s_k = x_{k+1} - x_k$. Since $\beta_k^{DL}$ may take negative values, Eq (8) was replaced by

$$\beta_k^{DL+} = \max\left\{\frac{g_{k+1}^T y_k}{d_k^T y_k}, 0\right\} - t \frac{g_{k+1}^T s_k}{d_k^T y_k}. \qquad (9)$$

Hager and Zhang [14,15] presented a modified CG parameter that satisfies the descent property for any inexact line search. This version of the CG method is globally convergent whenever the line search satisfies the WWP conditions. The formula is given as follows:

$$\beta_k^{HZ} = \frac{1}{d_k^T y_k}\left(y_k - 2 d_k \frac{\|y_k\|^2}{d_k^T y_k}\right)^T g_{k+1}.$$

Wei et al. [16] gave a new positive CG method, quite similar to the original PRP method, which is globally convergent under exact and inexact line searches; that is,

$$\beta_k^{WYL} = \frac{g_{k+1}^T \left(g_{k+1} - \frac{\|g_{k+1}\|}{\|g_k\|} g_k\right)}{\|g_k\|^2}.$$

In 2016, Alhawarat et al. [17] presented a modified PRP formula that is kept non-negative by means of an adaptive parameter $\mu_k$. Kaelo et al. [18] proposed a further hybrid CG formula. Yao et al. [19] proposed a three-term CG method with a new choice of the parameter $t$; based on the SWP line search, they selected $t$ so that the descent condition is satisfied, and they also established a corresponding convergence theorem. For more about the CG method and its applications, the reader can refer to the following references: [20][21][22][23][24].
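As a concrete reference point, the classical parameters above take only a few lines of NumPy. This is an illustrative sketch (the function names are ours), with $y_k = g_{k+1} - g_k$ passed implicitly through consecutive gradients:

```python
import numpy as np

# Classical CG update parameters from the introduction.
# g_new, g_old are consecutive gradients; d is the previous search direction.

def beta_hs(g_new, g_old, d):
    """Hestenes-Stiefel: g_{k+1}^T y_k / (d_k^T y_k)."""
    y = g_new - g_old
    return g_new @ y / (d @ y)

def beta_prp(g_new, g_old):
    """Polak-Ribiere-Polyak: g_{k+1}^T y_k / ||g_k||^2."""
    y = g_new - g_old
    return g_new @ y / (g_old @ g_old)

def beta_fr(g_new, g_old):
    """Fletcher-Reeves: ||g_{k+1}||^2 / ||g_k||^2."""
    return (g_new @ g_new) / (g_old @ g_old)

def beta_wyl(g_new, g_old):
    """Wei-Yao-Liu: rescales g_k inside y_k so the parameter stays non-negative."""
    scaled = (np.linalg.norm(g_new) / np.linalg.norm(g_old)) * g_old
    return g_new @ (g_new - scaled) / (g_old @ g_old)
```

Note that HS and PRP coincide whenever $d_k^T y_k = \|g_k\|^2$, which holds under an exact line search.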

The new formula and the algorithm
Since the DL parameter in (8) may take negative values, the new formula is constructed from the DL and WYL formulas so that it remains non-negative while inheriting the properties of the HS formula; the resulting parameter is given in Eq (10). Algorithm 1 shows the steps to find the optimal solution of the optimization problem. Algorithm 1. The steps of the CG method with Eq (10).
Step 1: Provide a starting point $x_0$. Set the initial search direction $d_0 = -g_0$. Let $k = 0$.
Step 2: If a stopping criterion (for example, $\|g_k\| \le \epsilon$) is satisfied, then stop.
Step 3: Compute the step length $\alpha_k$ using the SWP line search (4) and (5).
Step 4: Update the iterate by $x_{k+1} = x_k + \alpha_k d_k$ as in (2).
Step 5: Compute $\beta_k$ by Eq (10) and the new search direction $d_{k+1}$ by (3).
Step 6: Set $k = k + 1$ and go to Step 2.
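The steps above can be sketched as a generic CG loop. This is an illustrative implementation, not the paper's code: the new parameter of Eq (10) is not reproduced here, so the classical PRP formula (restricted to non-negative values) is used as a stand-in, and SciPy's strong Wolfe line search plays the role of the step-length rule:

```python
import numpy as np
from scipy.optimize import line_search

def cg_minimize(f, grad, x0, beta_fn, tol=1e-6, max_iter=500):
    """Generic CG loop following the structure of Algorithm 1.

    beta_fn(g_new, g_old, d) returns the CG parameter; any formula
    from the introduction can be plugged in.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                      # Step 1: initial direction
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:            # Step 2: stopping criterion
            break
        # Step 3: strong Wolfe line search (sigma = 0.1 is typical for CG)
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                       # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
            if alpha is None:
                break
        x = x + alpha * d                       # Step 4: Eq (2)
        g_new = grad(x)
        beta = max(beta_fn(g_new, g, d), 0.0)   # non-negative restriction
        d = -g_new + beta * d                   # Step 5: Eq (3)
        g = g_new                               # Step 6: next iteration
    return x

# PRP parameter as a stand-in for the paper's formula (10); d is unused by PRP.
def beta_prp(g_new, g_old, d):
    return g_new @ (g_new - g_old) / (g_old @ g_old)
```

On a strictly convex quadratic this loop reproduces the expected rapid CG convergence; the restart `d = -g` is the usual safeguard when the line search fails.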

Global convergence analysis of the CG algorithm with the new coefficient
To establish the convergence properties of the new formula, the following assumption is required.
Assumption 1.
A. The level set $\Omega = \{x : f(x) \le f(x_0)\}$ is bounded; that is, a positive constant $D$ exists such that $\|x\| \le D$ for all $x \in \Omega$.
B. In some neighbourhood $Q$ of $\Omega$, $f$ is continuously differentiable, and its gradient is Lipschitz continuous; that is, there exists a constant $L > 0$ such that $\|g(x) - g(y)\| \le L \|x - y\|$ for all $x, y \in Q$.

This assumption implies that there exists a positive constant $B$ such that $\|g(x)\| \le B$ for all $x \in \Omega$; this bound is useful in the study of CG methods and plays an important role in the proofs of global convergence. A direction $d_k$ is a descent direction if

$$g_k^T d_k < 0. \qquad (12)$$

Al-Baali [12] modified (12) to the following form and used it to prove the global convergence of the FR method:

$$g_k^T d_k \le -c \|g_k\|^2, \qquad c \in (0, 1). \qquad (13)$$

Equation (14) is the sufficient descent condition; its general form is

$$g_k^T d_k \le -c \|g_k\|^2, \qquad c > 0. \qquad (14)$$

Moreover, using (13) is stronger than using (12), since whenever (13) holds, the sufficient descent condition (14) also holds.

Lemma 3.1. Let the sequences $\{g_k\}$ and $\{d_k\}$ be generated by Algorithm 1. Then the sufficient descent condition (14) holds.
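Conditions (12) and (14) are cheap to verify numerically; the helper below is an illustrative sketch (the function names are ours) that tests a given gradient/direction pair:

```python
import numpy as np

def is_descent(g, d):
    """Condition (12): g_k^T d_k < 0."""
    return g @ d < 0

def satisfies_sufficient_descent(g, d, c):
    """Condition (14): g_k^T d_k <= -c ||g_k||^2 with c > 0."""
    return g @ d <= -c * (g @ g)

# The steepest-descent direction d = -g satisfies (14) with c = 1,
# but no larger c, since g^T d = -||g||^2 exactly.
g = np.array([3.0, -4.0])   # ||g||^2 = 25
d = -g                      # g^T d = -25
```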
Proof. Multiplying (3) by $g_{k+1}^T$, we obtain

$$g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2 + \beta_k g_{k+1}^T d_k.$$

We then have the following two cases, according to whether the new parameter vanishes. If $\beta_k = 0$, then $g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2$, and dividing both sides by $\|g_{k+1}\|^2$ gives the result directly. Otherwise, bounding the term $\beta_k g_{k+1}^T d_k$ and dividing both sides by $\|g_{k+1}\|^2$, we again obtain the result. □

The following lemma, which is referred to as the Zoutendijk condition [11], is useful for analysing the global convergence property of the CG method.

Lemma 3.2. Let Assumption 1 hold. Consider any CG method of the form (2) and (3) in which the search direction $d_k$ is a descent direction and $\alpha_k$ satisfies the WWP line search (4) and (6). Then the following condition holds:

$$\sum_{k \ge 0} \frac{(g_k^T d_k)^2}{\|d_k\|^2} < \infty.$$
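The Zoutendijk condition can be illustrated numerically: on a convex quadratic, running a descent method with a Wolfe line search and accumulating the terms of the series keeps the partial sums bounded. The sketch below is our own construction, using steepest-descent directions for simplicity (so each term equals $\|g_k\|^2$):

```python
import numpy as np
from scipy.optimize import line_search

# Convex quadratic test function f(x) = 0.5 x^T A x with minimizer 0.
A = np.diag([1.0, 2.0, 5.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x = np.array([1.0, 1.0, 1.0])
zoutendijk_sum = 0.0
for _ in range(50):
    g = grad(x)
    if np.linalg.norm(g) < 1e-12:
        break
    d = -g                                    # descent direction: g^T d < 0
    alpha = line_search(f, grad, x, d, gfk=g)[0]
    if alpha is None:                         # line search failed (not expected here)
        break
    zoutendijk_sum += (g @ d) ** 2 / (d @ d)  # term of the Zoutendijk series
    x = x + alpha * d
```

Since $f$ decreases monotonically toward its minimum, the accumulated sum stabilises at a finite value, in line with the lemma.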
Global convergence for general functions
The following property, which is referred to as Property*, was presented by Gilbert and Nocedal in [10]. This property is useful for obtaining the global convergence properties of CG methods related to the PRP or HS family. The property is given as follows:
Property*. Consider a method of the form (2) and (3), and suppose that $0 < \gamma \le \|g_k\|$ for all $k$. The method has Property* if there exist constants $b > 1$ and $\lambda > 0$ such that, for all $k$, $|\beta_k| \le b$, and $\|s_k\| \le \lambda$ implies $|\beta_k| \le \frac{1}{2b}$.
The forthcoming lemmas correspond to Lemmas 4.1 and 4.2 in [10]. Suppose that Assumption 1 holds, that the sequences $\{g_k\}$ and $\{d_k\}$ are generated by Algorithm 1 in which $\alpha_k$ is computed by the WWP line search and the sufficient descent condition (16) holds, and assume that the method has Property*. Suppose also that the gradients are bounded away from zero; that is, there exists $\gamma > 0$ such that

$$\|g_k\| \ge \gamma \quad \text{for all } k. \qquad (17)$$

From Lemmas 3.1 and 3.3-3.5, the global convergence of Algorithm 1 with the WWP line search can be established in a manner that is similar to that of Theorem 4.3 in [10]; therefore, the proof of the corresponding theorem is omitted.

Numerical results and discussion
To analyse the efficiency of the new formula, we selected several test problems from CUTEr [25]; they are listed in Table 1. We performed a comparison with other CG coefficients, including the CG-Descent [14,15], DY, and WYL coefficients, based on CPU time, number of iterations, number of function evaluations, and number of gradient evaluations.
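Comparisons over a test set of this kind are commonly summarised with Dolan–Moré performance profiles. The sketch below is illustrative, using hypothetical timings rather than the paper's data, and shows how such a profile can be computed with NumPy:

```python
import numpy as np

def performance_profile(T):
    """Dolan-More performance profile.

    T[i, s] is the cost (e.g. CPU time) of solver s on problem i;
    np.inf marks a failure. Returns the tau grid and, for each solver,
    the fraction of problems solved within a factor tau of the best solver.
    """
    ratios = T / T.min(axis=1, keepdims=True)       # performance ratios r_{i,s}
    taus = np.unique(ratios[np.isfinite(ratios)])   # evaluation points
    rho = np.array([[np.mean(ratios[:, s] <= t) for t in taus]
                    for s in range(T.shape[1])])
    return taus, rho

# Hypothetical CPU times (seconds) for 4 problems and 2 solvers.
T = np.array([[1.0, 2.0],
              [3.0, 1.5],
              [2.0, np.inf],   # solver 2 failed on this problem
              [4.0, 4.0]])
taus, rho = performance_profile(T)
```

The curve $\rho_s(\tau)$ at $\tau = 1$ gives the fraction of problems on which solver $s$ was (tied for) fastest, and the height of the curve for large $\tau$ gives its overall robustness.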