Conjugate Gradient Reuse Algorithm with Dynamic Step Size Control

A. N. Birkett and R. A. Goubran

Revision: July 10, 2003

Keywords: Conjugate Gradient, Adaptive Filters, Gradient Reuse, Dynamic Step Size

ABSTRACT

A new algorithm is presented that combines the Fast Conjugate Gradient algorithm (FCGA), the Modified Variable Step Size (MVSS) algorithm, and gradient reuse to provide a trade-off among convergence, tracking performance, and complexity. The proposed algorithm reuses the estimated conjugate direction vector d_k(n) at each iteration k to search for the minimum along one particular conjugate direction, thus performing a one-dimensional line search. The variable step size reduces the number of iterations needed to reach the minimum during the line search. Improved convergence and tracking are obtained compared to the NLMS, RLS, FCGA, and MVSS algorithms when the input data is correlated and the environment is non-stationary. By restricting the number of iterations performed during the line search, the same performance as the FCGA can be achieved with a smaller window size, and therefore reduced complexity. A simplified version of the proposed algorithm is also presented that reuses weight updates (i.e., the gradient) to avoid calculating gradients and conjugate directions at every sample n. This simplified algorithm invokes the conjugate gradient update only every Pth sample, reducing overall complexity by a factor of P compared to the FCGA. Simulation results are also presented.

1.0 INTRODUCTION

The Conjugate Gradient Algorithm (CGA) has been shown to provide convergence speed comparable to the recursive least squares (RLS) algorithm even when the input signal autocorrelation matrix is ill conditioned [1]. However, the CGA computational burden is still high compared to variations based on the Least
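The direction-reuse idea in the abstract, searching for the minimum along one fixed conjugate direction with a step size that adapts during the search, can be illustrated with a minimal sketch. This is not the authors' FCGA/MVSS algorithm: the quadratic cost, the function name `reuse_line_search`, and the step-halving rule are all assumptions made purely for illustration of a reused-direction one-dimensional line search.

```python
import numpy as np

def reuse_line_search(w, d, R, p, mu0, iters):
    """Reuse one fixed direction d for a 1-D line search (illustrative only).

    Minimizes the quadratic cost J(w) = 0.5 w^T R w - p^T w along d,
    halving the step size mu whenever a step fails to reduce the cost
    (a crude stand-in for dynamic step size control).
    """
    cost = lambda v: 0.5 * v @ R @ v - p @ v
    mu = mu0
    for _ in range(iters):
        g = R @ w - p                        # gradient of the quadratic cost
        step = -mu * (g @ d) / (d @ d) * d   # move only along the reused direction d
        if cost(w + step) < cost(w):
            w = w + step                     # accept: cost decreased
        else:
            mu *= 0.5                        # reject: shrink the step size
    return w
```

At the line minimum the gradient is orthogonal to d (g @ d ≈ 0), which is the stopping condition a practical implementation would test instead of a fixed iteration count.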