Advances in Mathematics: Scientific Journal 9 (2020), no. 7, 4955–4970
ISSN: 1857-8365 (printed); 1857-8438 (electronic)
https://doi.org/10.37418/amsj.9.7.61

A NEW MODIFICATION OF NPRP CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION

MAULANA MALIK 1, MUSTAFA MAMAT, SITI S. ABAS, IBRAHIM M. SULAIMAN, AND SUKONO

ABSTRACT. The conjugate gradient method is among the most efficient methods for solving unconstrained optimization problems. In this paper, we propose a new formula for the conjugate gradient method based on a modification of the NPRP formula (Zhang, 2009). The proposed method satisfies the sufficient descent condition, and a global convergence proof is established under some assumptions and the strong Wolfe line search. Numerical results based on 98 test problems show that the new method is very efficient compared with classical conjugate gradient methods.

1. INTRODUCTION

We consider the following unconstrained optimization problem

(1.1)    min {f(x) | x ∈ R^n},

where f : R^n → R is a continuously differentiable function. The conjugate gradient method is an iterative method for solving (1.1) with the following iteration:

(1.2)    x_{k+1} = x_k + α_k d_k,  k = 0, 1, 2, ...,

where x_0 is an initial point, x_k is the point at the kth iteration, d_k is the search direction, and α_k is the stepsize [1]. There are numerous methods for calculating the

1 corresponding author
2010 Mathematics Subject Classification. 49M37, 65K10, 90C06.
Key words and phrases. Conjugate gradient method, Sufficient descent condition, Global convergence, Strong Wolfe line search.
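To make the iteration (1.2) concrete, the sketch below implements a minimal nonlinear conjugate gradient loop in Python. It is an illustration only, not the paper's method: since the modified NPRP formula is not stated on this page, the sketch uses the classical Fletcher–Reeves coefficient β_k = ||g_{k+1}||² / ||g_k||², and a simple backtracking (Armijo) line search stands in for the strong Wolfe line search assumed in the convergence analysis.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimal nonlinear CG sketch for iteration (1.2): x_{k+1} = x_k + alpha_k d_k.

    Illustrative only: uses the Fletcher-Reeves beta and a backtracking
    line search, not the modified NPRP formula or strong Wolfe conditions
    of the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search for the stepsize alpha_k.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x = x + alpha * d                     # iteration (1.2)
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)    # Fletcher-Reeves beta_k
        d = -g_new + beta * d                 # next search direction
        g = g_new
    return x

# Usage: minimize the convex quadratic f(x) = x1^2 + 10*x2^2, minimizer at 0.
x_star = conjugate_gradient(
    lambda x: x[0]**2 + 10 * x[1]**2,
    lambda x: np.array([2 * x[0], 20 * x[1]]),
    [3.0, -2.0],
)
```

On a strongly convex quadratic such as the usage example, the iterates drive the gradient norm below the tolerance and `x_star` lands near the origin; for general nonconvex problems, conditions such as sufficient descent and a strong Wolfe line search (the focus of this paper) are needed to guarantee global convergence.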