A new class of three-term conjugate gradient methods for solving unconstrained minimization problems

Conjugate gradient (CG) methods, which usually generate descent search directions, are well suited to large-scale unconstrained optimization models because of their low memory requirements and simplicity. This paper studies three-term CG methods for unconstrained optimization. We modify a three-term CG method based on the formula $t^*$ suggested by Kafaki and Ghanbari [11], using some well-known CG formulas for unconstrained optimization. The proposed method satisfies both the descent and the sufficient descent conditions. Furthermore, under an exact line search the new method reduces to the classical CG method. Numerical results obtained by running the suggested method on a set of standard unconstrained optimization test functions show that it is promising and exhibits better numerical performance than the three-term (ZHS-CG) method.


Introduction
In this paper, we are interested in solving the unconstrained optimization problem, particularly at large scale, given in the form
$$\min_{x \in \mathbb{R}^n} h(x), \qquad (1.1)$$
where $h : \mathbb{R}^n \to \mathbb{R}$ and $h \in C^2$. Numerous problems across professional fields of science can be reduced to the above optimization problem (see, e.g., [7,19]). A nonlinear conjugate gradient method is an iterative scheme that creates a sequence $\{x_k\}$ of approximations to the solution of (1.1) using the recurrence
$$x_{k+1} = x_k + \lambda_k d_k, \qquad k = 0, 1, 2, \dots, \qquad (1.2)$$
where $\lambda_k > 0$ is the step length, $x_k$ is the current iterate, and the search direction is defined by
$$d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad (1.3)$$
where $g_k = \nabla h(x_k)$ is the gradient and $\beta_k$ is a scalar called the conjugate gradient parameter. For example, Hestenes and Stiefel (HS) [9], Polak–Ribière–Polyak (PRP) [14], and Fletcher and Reeves (FR) [5] used the update parameters
$$\beta_k^{HS} = \frac{g_{k+1}^T y_k}{d_k^T y_k}, \qquad (1.4)$$
$$\beta_k^{PRP} = \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \qquad (1.5)$$
$$\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad (1.6)$$
respectively, where $y_k = g_{k+1} - g_k$. The PRP method is very effective in terms of numerical performance, but it fails to converge globally for general functions under the Wolfe line search; this remains an open problem that many researchers want to solve. It is worth noting that a recent work of Yuan et al. [16] proved the global convergence of the PRP method under a modified Wolfe line search for general functions. Al-Baali [2], Gilbert and Nocedal [8], and Hu and Storey [10] observed that the sufficient descent property may be decisive for the global convergence of CG methods, including the PRP method.
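The classical CG iteration described above can be sketched in a few lines. The sketch below is illustrative only: it uses a simple backtracking (Armijo) line search and a steepest-descent restart as stand-ins for the Wolfe search and safeguards discussed in the paper, and the quadratic test problem is our own choice, not one of the paper's test functions.

```python
import numpy as np

def cg_minimize(h, grad, x0, beta_rule="PRP", tol=1e-6, max_iter=500):
    """Classical nonlinear CG: iteration (1.2) with direction update (1.3).

    A backtracking Armijo line search stands in for the Wolfe search;
    this is an illustrative choice, not the authors' implementation."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:              # safeguard: restart if d_k is not a descent direction
            d = -g
        lam = 1.0
        while h(x + lam * d) > h(x) + 1e-4 * lam * (g @ d) and lam > 1e-12:
            lam *= 0.5              # halve the step until sufficient decrease holds
        x_new = x + lam * d
        g_new = grad(x_new)
        y = g_new - g               # y_k = g_{k+1} - g_k
        if beta_rule == "HS":       # (1.4)
            beta = (g_new @ y) / (d @ y)
        elif beta_rule == "PRP":    # (1.5)
            beta = (g_new @ y) / (g @ g)
        else:                       # "FR", (1.6)
            beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d       # (1.3)
        x, g = x_new, g_new
    return x

# usage: a strictly convex quadratic, whose minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
h = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_minimize(h, grad, np.zeros(2), beta_rule="PRP")
```

Swapping `beta_rule` between `"HS"`, `"PRP"`, and `"FR"` selects among the three update parameters (1.4)–(1.6) without changing the rest of the iteration.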
Another important class of CG methods is the three-term conjugate gradient methods, in which the search direction is determined as a linear combination of $-g_{k+1}$, $d_k$, and $y_k$ as
$$d_{k+1} = -g_{k+1} + \delta_1 d_k + \delta_2 y_k, \qquad (1.7)$$
where $\delta_1$ and $\delta_2$ are scalars. Among the three-term conjugate gradient methods in the literature are those proposed by Zhang et al. [17,18], obtained by considering a descent modified PRP and a descent modified HS conjugate gradient method:
$$d_{k+1} = -g_{k+1} + \beta_k^{PRP} d_k - \frac{g_{k+1}^T d_k}{\|g_k\|^2}\, y_k, \qquad (1.8)$$
$$d_{k+1} = -g_{k+1} + \beta_k^{HS} d_k - \frac{g_{k+1}^T d_k}{d_k^T y_k}\, y_k. \qquad (1.9)$$
In the same manner, Nazareth [13] proposed a computationally effective three-term nonlinear conjugate gradient method with the search direction
$$d_{k+1} = -y_k + \frac{y_k^T y_k}{d_k^T y_k}\, d_k + \frac{y_{k-1}^T y_k}{d_{k-1}^T y_{k-1}}\, d_{k-1}. \qquad (1.10)$$
The step length $\lambda_k$ in (1.2) is computed by carrying out a line search. The standard strong Wolfe line search [15] computes $\lambda_k$ such that
$$h(x_k + \lambda_k d_k) \le h(x_k) + \delta \lambda_k g_k^T d_k, \qquad |g(x_k + \lambda_k d_k)^T d_k| \le \sigma |g_k^T d_k|, \qquad (1.11)$$
with $0 < \delta < \sigma < 1$. The remainder of this paper is organized as follows. Section two deals with the derivation of the new three-term conjugate gradient method (NTT-CG). In section three, we prove the descent condition and the sufficient descent condition of the NTT-CG method. The numerical results and discussion are reported in section four. Finally, the paper closes with concluding remarks in the last section.
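The two strong Wolfe inequalities can be checked directly for a candidate step length. The following is a minimal sketch; the parameter names `delta` and `sigma` and their default values are the conventional ones, not values taken from the paper.

```python
import numpy as np

def satisfies_strong_wolfe(h, grad, x, d, lam, delta=1e-4, sigma=0.9):
    """Check the two strong Wolfe conditions (1.11) for a trial step lam."""
    g_d = grad(x) @ d
    # sufficient decrease (Armijo) condition
    armijo = h(x + lam * d) <= h(x) + delta * lam * g_d
    # strong curvature condition
    curvature = abs(grad(x + lam * d) @ d) <= sigma * abs(g_d)
    return armijo and curvature

# usage: for h(x) = ||x||^2 / 2 starting from x = (2,), along d = -g,
# the full step lam = 1 lands at the minimizer, while lam = 2 overshoots
h = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x0 = np.array([2.0])
ok_full = satisfies_strong_wolfe(h, grad, x0, -grad(x0), 1.0)
ok_double = satisfies_strong_wolfe(h, grad, x0, -grad(x0), 2.0)
```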

Derivation Of A New Three-Term Conjugate Gradient Method (NTT-CG)
In this work, we deal with a modified form of the search direction (1.9). In 2014, Kafaki and Ghanbari [11] proposed a new value of $t^*$. Now, a parameter in $(0,1)$ is chosen here so that it forms a convex combination of the first and second terms of equation (2.1), a parameter greater than or equal to one is added on the third term, and the parameter in the second term is replaced by one of the classical CG formulas. More precisely, the search direction of our method, named the NTT-CG method, is defined by equation (2.2), where the CG parameter is taken from a classical formula; in this paper we use those of Hestenes and Stiefel (HS), Polak–Ribière–Polyak (PRP), and Fletcher and Reeves (FR). The new search directions can then be written accordingly. Since the quantities involved are positive, the parameter is greater than or equal to zero.
Step 3: Determine the step size $\lambda_k$ by using a cubic line search to minimize $h(x_k + \lambda d_k)$.
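Equation (2.2) did not survive in this copy of the text, so the sketch below only illustrates the general shape of a three-term direction of type (1.7) with a damped HS-type first coefficient and a ZHS-style second coefficient as in (1.9). The coefficients `delta1` and `delta2` and the damping factor `mu` are placeholders for illustration, not the NTT-CG formulas.

```python
import numpy as np

def three_term_direction(g_new, d, y, mu=0.5):
    """Generic three-term direction d_{k+1} = -g_{k+1} + delta1*d_k + delta2*y_k.

    Placeholder coefficients only: an HS-type term damped by mu in (0,1),
    and a ZHS-style term as in (1.9). NOT the paper's (2.2)."""
    delta1 = mu * (g_new @ y) / (d @ y)   # placeholder beta-type coefficient
    delta2 = -(g_new @ d) / (d @ y)       # placeholder third-term coefficient
    return -g_new + delta1 * d + delta2 * y

# usage: a small hand-checkable example
d_next = three_term_direction(np.array([1.0, 0.0]),   # g_{k+1}
                              np.array([0.0, 1.0]),   # d_k
                              np.array([1.0, 1.0]))   # y_k
```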

The Descent And Sufficient Descent Conditions Of The New Three-Term (NTT-CG) Method
An important feature of any minimization algorithm is satisfaction of the descent and sufficient descent conditions. A direction $d_k$ satisfies the descent condition if $g_k^T d_k < 0$, and the sufficient descent condition if $g_k^T d_k \le -c\|g_k\|^2$ for some constant $c > 0$. In this section, we examine the descent and sufficient descent properties of the NTT-CG method. Now, multiplying the equation (
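The two conditions analysed in this section can also be verified numerically for any computed direction. This is a minimal check of the standard definitions, not a substitute for the paper's proof; the constant `c` is an arbitrary illustrative choice.

```python
import numpy as np

def check_descent(g, d, c=0.1):
    """Return (descent, sufficient_descent) for gradient g and direction d:
    descent means g^T d < 0; sufficient descent means g^T d <= -c * ||g||^2."""
    gtd = g @ d
    return gtd < 0, gtd <= -c * (g @ g)

# usage: steepest descent d = -g satisfies both conditions (with c <= 1),
# while the ascent direction d = +g satisfies neither
g = np.array([3.0, 4.0])
des, suff = check_descent(g, -g, c=1.0)
des_up, suff_up = check_descent(g, g)
```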

Numerical results and discussion
This section reports numerical experiments comparing the modified three-term CG (NTT-CG) method with the (ZHS-CG) method. The tests use well-known nonlinear standard test functions [3], with the number of variables $n$ ranging over $10 \le n \le 5000$. The algorithm is coded in FORTRAN 95 with given initial points. In Table (

Conclusion
In this paper, a modified three-term conjugate gradient method for solving nonlinear unconstrained optimization problems, given in formula (2.2) and based on the formula $t^*$ together with some well-known CG formulas (HS, PRP, and FR), is presented. The proposed method possesses the descent and sufficient descent conditions, which hold without any line search technique. The numerical results show that the proposed method is promising and more efficient than the (ZHS-CG) method considered. In future work, the (NTT-CG) method could be used for training neural networks in order to further investigate its behaviour.