Copyright © IFAC System Identification, Copenhagen, Denmark, 1994

ON COVARIANCE MODIFICATION AND REGULARIZATION IN RECURSIVE LEAST SQUARES IDENTIFICATION

S. GUNNARSSON

Department of Electrical Engineering, Linköping University, S-58183 Linköping, Sweden

Abstract. In this paper the relationships between covariance modification and regularization in recursive least squares identification are investigated. An update equation for the information matrix is derived, and it is shown how regularization of the information matrix can be expressed as a particular type of covariance matrix modification. The paper also presents an analysis of the effects of a covariance modification for obtaining regularization that was proposed in Salgado et al., 1988.

Key Words. Recursive least squares identification, tracking, covariance modification, regularization.

1. INTRODUCTION

The recursive least squares (RLS) algorithm is a useful tool in automatic control and signal processing for identifying systems and signals with time varying character. See e.g. Ljung and Söderström, 1983 for an introduction. In order for the algorithm to maintain its tracking ability it is, however, necessary to have a mechanism that prevents the algorithm gain from tending to zero. Such mechanisms involve the use of, for example, a forgetting factor or covariance modification, see e.g. Goodwin and Sin, 1984. These methods, however, introduce a new problem, since the algorithm becomes sensitive to poor excitation in the input signal. Running the RLS algorithm with a forgetting factor implies that periods with poor excitation can result in an exponential growth of the covariance matrix in the algorithm. This phenomenon, which is sometimes denoted estimator windup, is discussed in e.g. Åström and Wittenmark, 1989. This problem makes it necessary to introduce some safety action that handles this situation. One method that has been proposed, see Åström and Wittenmark, 1989, is to monitor the trace of the covariance matrix in the algorithm, and if necessary normalize this matrix. Another alternative is to concentrate on the information matrix, i.e. the inverse of the covariance matrix, and prevent it from becoming singular. This is achieved using so called regularization, and this is the focus of this paper. Of particular interest will be a method proposed in Salgado et al., 1988.

2. THE RECURSIVE LEAST SQUARES ALGORITHM

Consider the problem of estimating the parameter vector θ in the linear regression

    y(t) = φᵀ(t)θ + v(t)    (1)

Here y(t) denotes the measured output signal and v(t) is a disturbance signal, while the regression vector φ(t) contains delayed versions of the input signal u(t) and the output signal y(t). A standard method for obtaining an estimate of the parameter vector is to minimize the weighted least squares criterion

    V_t(θ) = Σ_{k=1}^{t} α(k)(y(k) − φᵀ(k)θ)²    (2)

with respect to θ. As shown in, for example, Ljung and Söderström, 1983, the criterion (2) is minimized by θ̂(t), which is updated according to

    θ̂(t) = θ̂(t − 1) + K(t)ε(t)    (3)

where the gain vector K(t) is given by

    K(t) = α(t)R⁻¹(t)φ(t)    (4)

and ε(t) denotes the prediction error

    ε(t) = y(t) − φᵀ(t)θ̂(t − 1)    (5)

Furthermore, R(t) denotes the information matrix

    R(t) = Σ_{k=1}^{t} α(k)φ(k)φᵀ(k)    (6)
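The RLS update (3)-(5) and the windup phenomenon discussed above can be illustrated with a short numerical sketch. The fragment below implements the standard covariance form of RLS with an exponential forgetting factor, together with the trace-monitoring safety action attributed to Åström and Wittenmark, 1989: when the trace of the covariance matrix P(t) = R⁻¹(t) exceeds a bound, P(t) is rescaled. The function name `rls_step`, the forgetting factor value, and the bound `trace_max` are illustrative choices, not quantities taken from the paper.

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=0.98, trace_max=1e4):
    """One RLS update in covariance form, P(t) = inverse of R(t).

    Implements the parameter update (3), gain (4), and prediction
    error (5) with exponential forgetting factor lam.  During poor
    excitation (phi close to zero) P grows as P/lam per step, i.e.
    exponentially: estimator windup.  As a safety action, P is
    rescaled whenever its trace exceeds trace_max (an illustrative
    bound), following the trace-monitoring idea of Astrom and
    Wittenmark, 1989."""
    eps = y - phi @ theta                  # prediction error, eq. (5)
    K = P @ phi / (lam + phi @ P @ phi)    # gain vector, cf. eq. (4)
    theta = theta + K * eps                # parameter update, eq. (3)
    P = (P - np.outer(K, phi @ P)) / lam   # covariance (Riccati) update
    tr = np.trace(P)
    if tr > trace_max:                     # windup safety action:
        P = P * (trace_max / tr)           # normalize so trace(P) = trace_max
    return theta, P
```

Running the recursion on well-excited data drives θ̂(t) toward the true parameters; feeding it φ(t) = 0 for a stretch leaves θ̂(t) untouched but inflates P(t) by a factor 1/λ per step until the trace normalization takes over, which is precisely the situation the regularization methods of this paper are designed to handle at the level of the information matrix instead.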