Mathematics and Statistics 10(3): 610-614, 2022 http://www.hrpub.org
DOI: 10.13189/ms.2022.100317
A Descent Conjugate Gradient Method with Global
Convergence Properties for Non-Linear Optimization
Salah Gazi Shareef
Department of Mathematics, Faculty of Science, University of Zakho, Kurdistan Region, Iraq
Received February 14, 2022; Revised April 22, 2022; Accepted May 23, 2022
Copyright©2022 by authors, all rights reserved. Authors agree that this article remains permanently open access under the
terms of the Creative Commons Attribution License 4.0 International License
Abstract  Iterative methods such as the conjugate gradient method are well-known methods for solving non-linear unconstrained minimization problems, partly because of their capacity to handle large-scale unconstrained optimization problems rapidly, and partly because of their simple algebraic representation and implementation in computer programs. The conjugate gradient method has wide applications in many fields, such as machine learning and neural networks. Fletcher and Reeves [1] extended the approach to nonlinear problems in 1964; theirs is considered the first nonlinear conjugate gradient technique. Since then, many other conjugate gradient methods have been proposed. In this work, we propose a new conjugate gradient coefficient for finding the minimum of non-linear unconstrained optimization problems, based on the Hestenes-Stiefel parameter. Section one of this work contains the derivation of the new method. In section two, we verify the descent and sufficient descent conditions. In section three, we study the global convergence property of the newly proposed method. In the fourth section, we give some numerical results using known test functions and compare the new method with the Hestenes-Stiefel method to demonstrate the effectiveness of the suggested method. Finally, we give conclusions.
Keywords  Non-Linear Minimization, Conjugate Gradient Algorithm, Descent Property, Global Convergence Property
1. Introduction
Consider the nonlinear unconstrained minimization problem

\min f(x), \quad x \in \mathbb{R}^n \qquad (1.1)

where f : \mathbb{R}^n \to \mathbb{R} is a continuously differentiable, real-valued function. To solve problem (1.1), we use the iterative method

x_{k+1} = x_k + \alpha_k d_k \qquad (1.2)

starting from an initial guess x_1 \in \mathbb{R}^n, where s_k = \alpha_k d_k = x_{k+1} - x_k, the positive step length \alpha_k is computed by a one-dimensional line search, and d_k is the search direction. The steepest-descent search direction has the form

d_1 = -g_1 \qquad (1.3)
Subsequent search directions are calculated by

d_{k+1} = -g_{k+1} + \beta_k d_k \qquad (1.4)

where g_k = \nabla f(x_k) and \beta_k is a scalar. Basic choices of the parameter \beta_k are those of Hestenes and Stiefel (HS) [2], Polak, Ribière and Polyak (PRP) [3], Fletcher and Reeves (FR) [4], Dai and Yuan (DY) [5], Dai and Liao [6], Perry [7], and Liu and Storey [8], which are shown below:
\beta_k^{HS} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{d_k^T (g_{k+1} - g_k)} \qquad (1.5)

\beta_k^{PRP} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{\|g_k\|^2} \qquad (1.6)