Journal of Kirkuk University – Scientific Studies, vol. 6, No. 2, 2011

A New Non-Quadratic Algorithm for Solving Non-Linear Optimization Problems

Adham A. Ali
College of Sciences – University of Kirkuk

Received: 2010/9/1, Accepted: 2011/6/19

Abstract

This paper proposes a new algorithm for non-linear optimization that modifies and develops the conjugate gradient (CG) methods and attains strong global convergence. The algorithm is derived and evaluated numerically against the standard (P/R and H/S) CG algorithms and the T/S algorithm on more than 20 standard, well-known test functions. The numerical results show that non-quadratic models are very beneficial on most of the problems, especially as the dimensionality of the problem increases.

Introduction

Conjugate gradient (CG) methods were proposed by Hestenes and Stiefel (Hestenes & Stiefel, 1952) for solving systems of linear equations. Their use in unconstrained optimization was prompted by the fact that minimizing a positive-definite quadratic function is equivalent to solving the linear equation system obtained by setting its gradient to zero. Conjugate gradient methods as applied to quadratic functions are described first. The extension of conjugate gradient methods to solving non-linear equation systems, and their use in solving general unconstrained minimization problems, was first carried out by Fletcher and Reeves (Fletcher & Reeves, 1964). We will show how these methods can be extended to minimize general non-linear functions.

Conjugate gradient methods have in general the following basic properties (Dragica Vasileska, 2006):
1) The conjugacy condition.
2) The orthogonality condition.
3) The descent direction.
4) The quadratic termination condition with exact line search (ELS).

Concept of the extended CG-methods (ECG)

A function f is defined as a non-linear scaling of the quadratic function q(x) if the following condition holds:

f(x) = F(q(x)),  dF/dq > 0 and q(x) ≥ 0 . . . (1)
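The quadratic termination property mentioned above can be illustrated with the classical linear CG method: on a strictly convex quadratic (equivalently, a symmetric positive-definite linear system) with exact line search, CG reaches the minimizer in at most n steps. The following is a minimal sketch; the matrix A and vector b are made-up test data, not from the paper.

```python
# Minimal linear conjugate gradient for A x = b with A symmetric
# positive-definite.  Illustrative sketch only; A and b below are
# invented test data, not taken from the paper.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def cg(A, b, tol=1e-12):
    n = len(b)
    x = [0.0] * n                # start from the origin
    r = b[:]                     # residual r = b - A x = b at x = 0
    d = r[:]                     # first search direction is the residual
    for k in range(n):           # quadratic termination: at most n steps
        Ad = matvec(A, d)
        alpha = dot(r, r) / dot(d, Ad)        # exact line search step
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r_new = [ri - alpha * Adi for ri, Adi in zip(r, Ad)]
        if dot(r_new, r_new) < tol:
            return x, k + 1
        beta = dot(r_new, r_new) / dot(r, r)  # conjugacy-preserving update
        d = [ri + beta * di for ri, di in zip(r_new, d)]
        r = r_new
    return x, n

A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
x, iters = cg(A, b)   # converges in at most 3 iterations for this 3x3 system
```

Each step combines the conjugacy condition (successive directions are A-conjugate) with the orthogonality of successive residuals, which is exactly why finite termination holds on quadratics.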
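Condition (1) can be checked numerically: because F is strictly increasing (dF/dq > 0), the scaled function f = F(q(x)) has the same minimizer as the quadratic q, which is what lets CG-type methods be extended to such non-quadratic models. The functions F, q, and the grid below are illustrative choices of mine, not taken from the paper.

```python
# Sketch of the non-linear scaling condition (1): f(x) = F(q(x)) with
# dF/dq > 0 and q(x) >= 0.  F, q, and the search grid are illustrative
# assumptions, not the paper's test functions.
import math

def q(x1, x2):
    # a non-negative convex quadratic, minimized at (1, -3)
    return (x1 - 1.0) ** 2 + 2.0 * (x2 + 3.0) ** 2

def F(t):
    return math.exp(t)          # dF/dq = exp(q) > 0 for every q

def f(x1, x2):
    return F(q(x1, x2))         # the non-linearly scaled function

# Since F is strictly increasing, f and q rank points identically,
# so their minimizers over any set coincide.
grid = [(-0.5 + 0.25 * i, -4.0 + 0.25 * j)
        for i in range(13) for j in range(13)]
best_q = min(grid, key=lambda p: q(*p))
best_f = min(grid, key=lambda p: f(*p))
# best_q and best_f are the same grid point, (1.0, -3.0)
```

This monotone-scaling argument is the core of the extended CG (ECG) idea: the line-search and direction formulas can be rewritten in terms of q and dF/dq without changing the minimizer.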