A New Projection Technique with Gradient Property to Solve Optimization Problems

In this study, a new gradient projection technique is proposed whose search direction consists of three terms and achieves the pure descent property. In this technique, we combine the conjugate gradient algorithm with a projection technique to obtain a new algorithm for solving a wide range of unconstrained optimization problems. We establish global convergence under some hypotheses, and our numerical results make clear that the new formula is effective and promising.


Introduction
Numerous applications of the conjugate gradient projection technique exist, including mathematical programming and machine learning [1,2]. It is an efficient and effective technique for solving the unconstrained optimization problem

    min f(x),  x ∈ ℝⁿ,   (1)

where f : ℝⁿ → ℝ is a continuously differentiable real-valued function. The iterative gradient projection technique is an effective tool for solving problem (1) [3,4]. To solve this problem, the iterative sequence

    x_{k+1} = x_k + α_k d_k,  k = 0, 1, 2, ...,

is generated from the initial point x_0 ∈ ℝⁿ, where α_k > 0 is the step length and d_k is the search direction.
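As a concrete illustration of the iterative scheme above, the sketch below runs the basic recurrence x_{k+1} = x_k + α_k d_k with the simplest possible stand-in choices: a steepest-descent direction d_k = −g_k and a fixed step length. These choices are illustrative assumptions only, not the direction and step-length rules of the proposed method.

```python
import numpy as np

def iterate(grad, x0, alpha=0.1, tol=1e-8, max_iter=1000):
    """Basic iterative scheme x_{k+1} = x_k + alpha_k * d_k.

    Here d_k = -grad(x_k) (steepest descent) and a fixed step length
    stand in for a conjugate gradient direction and a Wolfe line search.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when ||g_k|| is small
            break
        x = x - alpha * g             # x_{k+1} = x_k + alpha_k * d_k
    return x, k

# Example: minimize f(x) = ||x||^2, whose gradient is 2x.
x_star, iters = iterate(lambda x: 2.0 * x, [3.0, -4.0])
```

Any descent method for problem (1) fits this template; the choices of d_k and α_k are what distinguish the algorithms compared later.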

The New Algorithm
We define the framework of the proposed approach in this work and show that it satisfies a nice descent property. For large-scale unconstrained optimization problems, we use the projection technique with a conjugate gradient direction (CGD). An appropriate hyperplane H_k was constructed by Solodov and Svaiter [29], which strictly separates the current iterate x_k from the solution set of the problem. The next iteration point x_{k+1} is constructed by projecting x_k onto H_k; following Solodov and Svaiter, x_{k+1} is defined by

    x_{k+1} = x_k − (g(z_k)ᵀ(x_k − z_k) / ‖g(z_k)‖²) g(z_k),

where z_k = x_k + α_k d_k is the trial point produced by the line search. Our method's new search direction d_k is calculated by the three-term formula given in (4)–(6). An appealing property of the algorithm, which follows from (4), (5), and (6), is that

    g_kᵀ d_k = −‖g_k‖²,

independent of any line search technique. As a result, the direction defined by our method satisfies the sufficient descent condition, i.e.

    g_kᵀ d_k ≤ −c‖g_k‖²,   (7)

where c = 1. The following are the steps of the new algorithm for the suggested approach.
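The steps above can be sketched in code. Because the paper's own three-term direction is given by (4)–(6), the sketch below substitutes the classical Fletcher–Reeves direction as a stand-in; the line search and projection step follow the Solodov–Svaiter construction, and all parameter values are illustrative assumptions.

```python
import numpy as np

def cg_projection(grad, x0, sigma=1e-4, rho=0.5, tol=1e-6, max_iter=500):
    """Sketch of a conjugate-gradient projection iteration.

    The search direction uses the Fletcher-Reeves beta as a stand-in for
    the paper's three-term formula; the projection of x_k onto the
    separating hyperplane H_k follows Solodov and Svaiter.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                     # d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking: find alpha_k with grad(z_k)^T d_k <= -sigma*alpha*||d_k||^2
        alpha = 1.0
        while grad(x + alpha * d) @ d > -sigma * alpha * (d @ d) and alpha > 1e-12:
            alpha *= rho
        z = x + alpha * d                      # trial point z_k
        gz = grad(z)
        # Projection of x_k onto the hyperplane through z_k with normal g(z_k)
        x = x - (gz @ (x - z)) / (gz @ gz) * gz
        g_new = grad(x)
        beta_fr = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves beta (stand-in)
        d = -g_new + beta_fr * d
        g = g_new
    return x

# Example: minimize f(x) = 0.5 * ||x||^2, whose gradient is x.
x_min = cg_projection(lambda x: x, [3.0, -4.0])
```

The projection step never increases the distance to the solution set, which is the source of the method's robustness on large-scale problems.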

Assumptions (3.1):
(i) f(x) is bounded below and continuous on the level set Ω = {x ∈ ℝⁿ : f(x) ≤ f(x_0)}, and differentiable in a neighborhood N of the level set.
(ii) The gradient g(x) is Lipschitz continuous (LC) on N; that is, there exists L > 0 such that ‖g(x) − g(y)‖ ≤ L‖x − y‖ for all x, y ∈ N.
Assumptions (3.1) imply that ∃ U > 0 such that ‖g(x)‖ ≤ U for all x ∈ Ω. To evaluate global convergence, we show that every direction d_k generated by the new technique satisfies (7). In the next lemma, we establish a lower bound for the step length α_k obtained by the Wolfe line search.

Lemma (3.1):
Suppose that Assumptions (3.1) hold and that the sequences {d_k} and {x_k} are generated by the new algorithm, with the step length α_k determined by the Wolfe line search. Then

    α_k ≥ (1 − σ)‖g_k‖² / (L‖d_k‖²).   (12)

Proof: By the curvature condition of the Wolfe line search we have (8), that is,

    (g_{k+1} − g_k)ᵀ d_k ≥ (σ − 1) g_kᵀ d_k.

By the Cauchy–Schwarz inequality and the Lipschitz continuity (LC) of the gradient,

    (g_{k+1} − g_k)ᵀ d_k ≤ ‖g_{k+1} − g_k‖ ‖d_k‖ ≤ L α_k ‖d_k‖².

Combining these two bounds and using the fact that d_k satisfies (7) with c = 1, we obtain

    α_k ≥ (σ − 1) g_kᵀ d_k / (L‖d_k‖²) ≥ (1 − σ)‖g_k‖² / (L‖d_k‖²),

so (12) is satisfied. The proof is finished. □
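The lower bound (12) presumes a step length produced by the Wolfe line search. A minimal sketch of such a search, assuming the conventional parameter values δ = 1e-4 and σ = 0.9 (these values and the bisection strategy are illustrative, not taken from the paper):

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, delta=1e-4, sigma=0.9, max_iter=50):
    """Bisection-style search for a step alpha satisfying the Wolfe conditions:
      f(x + alpha*d) <= f(x) + delta*alpha*g^T d     (sufficient decrease)
      grad(x + alpha*d)^T d >= sigma*g^T d           (curvature)
    """
    gtd = grad(x) @ d
    lo, hi, alpha = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) > f(x) + delta * alpha * gtd:
            hi = alpha                                  # decrease fails: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < sigma * gtd:
            lo = alpha                                  # curvature fails: grow
            alpha = 2.0 * alpha if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha                                # both conditions hold
    return alpha

# Example on f(x) = 0.5*||x||^2 along the steepest-descent direction.
f = lambda x: 0.5 * float(x @ x)
g = lambda x: x
x0 = np.array([4.0, 0.0])
a = wolfe_line_search(f, g, x0, -x0)
```

The curvature condition is exactly the inequality used in the first step of the proof of Lemma (3.1).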
We use the Zoutendijk condition [30] together with the Wolfe conditions to establish the global convergence of the conjugate gradient projection process. The following lemma is important for showing that the line search of the proposed algorithm satisfies the Zoutendijk condition.

Lemma (3.2):
Suppose that Assumptions (3.1) hold, the sequences {d_k} and {x_k} are generated by the suggested methodology, and the step length α_k is determined by the Wolfe line search. Then

    ∑_{k=0}^{∞} (g_kᵀ d_k)² / ‖d_k‖² < +∞.   (13)

Proof: From the sufficient decrease condition of the Wolfe line search, for any k we get

    f_k − f_{k+1} ≥ −δ α_k g_kᵀ d_k ≥ (δ(1 − σ) / L) (g_kᵀ d_k)² / ‖d_k‖²,

where the second inequality follows from (7) and (12). Since f is bounded below by Assumptions (3.1), summing this inequality over all k gives (13). The proof is finished. □
As a result, we know that the iteration of the conjugate gradient method (CGM) can fail, in the sense that ‖g_k‖ ≥ ε for all k ≥ 0, only if ‖d_k‖ → ∞ sufficiently quickly. To put it another way, the sequence {‖g_k‖} can only be bounded away from zero if ∑_{k=0}^{∞} 1/‖d_k‖² < +∞.
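The Zoutendijk quantity in (13) can be checked informally in code. The run below accumulates the terms (g_kᵀ d_k)²/‖d_k‖² along a simple descent trajectory; the fixed-step steepest-descent iteration is an illustrative stand-in for the paper's method, chosen only because its partial sums can be followed easily.

```python
import numpy as np

def zoutendijk_terms(grad, x0, alpha=0.2, n_steps=200):
    """Accumulate (g_k^T d_k)^2 / ||d_k||^2 along a descent run.

    With d_k = -g_k the term reduces to ||g_k||^2, which decays
    geometrically on a strongly convex quadratic, so the partial
    sums converge to a finite limit, as (13) predicts.
    """
    x = np.asarray(x0, dtype=float)
    total = 0.0
    for _ in range(n_steps):
        g = grad(x)
        d = -g
        total += (g @ d) ** 2 / max(d @ d, 1e-300)  # guard tiny denominators
        x = x + alpha * d
    return total

# On f(x) = ||x||^2 (gradient 2x) the partial sums stabilize quickly.
s_200 = zoutendijk_terms(lambda x: 2.0 * x, [1.0, 1.0])
```

Doubling the number of steps leaves the sum essentially unchanged, which is the numerical signature of the convergent series in (13).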
The global convergence of the proposed solution is now investigated in the following theorem.

Numerical Results
The suggested algorithm (NO) is compared with three other algorithms, including:
NHS: produced by Mahdi M M and Shiker M A K (2020) [31].
HHT: created by Mahdi M M and Shiker M A K (2020) [32].
Tables (4.1) and (4.2) indicate that the CPU time, the number of iterations (NI), and the number of function evaluations (NF) of the new algorithm are lower than those of the other three algorithms; hence, the new algorithm is effective and promising.
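The three criteria reported in the tables, NI, NF, and CPU time, can be collected with a small harness like the one below. The solver and test problem here are placeholders (fixed-step gradient descent on a quadratic), not the paper's actual algorithms or test set.

```python
import time
import numpy as np

def run_solver(solver, f, grad, x0):
    """Collect NI (iterations), NF (function evaluations), and CPU time
    for one solver on one problem -- the three criteria used in the tables."""
    nf = 0
    def f_counted(x):
        nonlocal nf
        nf += 1              # count every function evaluation
        return f(x)
    t0 = time.process_time()
    _, ni = solver(f_counted, grad, np.asarray(x0, dtype=float))
    return ni, nf, time.process_time() - t0

def gd(f, grad, x, alpha=0.1, tol=1e-8, max_iter=10000):
    """Placeholder solver: fixed-step gradient descent."""
    for k in range(max_iter):
        if np.linalg.norm(grad(x)) < tol:
            return x, k
        f(x)                 # evaluation counted by the harness
        x = x - alpha * grad(x)
    return x, max_iter

ni, nf, cpu = run_solver(gd, lambda x: float(x @ x), lambda x: 2.0 * x, [1.0, 2.0])
```

Running each algorithm through the same harness on the same problem set is what makes the NI/NF/CPU comparison in Tables (4.1) and (4.2) meaningful.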

Conclusion
In this work, we suggested a new projection technique to solve unconstrained optimization problems. The proposed method achieves global convergence with the standard Wolfe line search. All of the numerical results indicate that the new algorithm is more efficient than the other three algorithms it was compared with.