2398 IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 45, NO. 9, SEPTEMBER 1997

The Barankin Bound and Threshold Behavior in Frequency Estimation

Luc Knockaert

Abstract—This correspondence presents the Barankin bound as a fundamental statistical tool for understanding the threshold effect associated with the estimation of the frequency of a sinusoid in additive white Gaussian noise. It is shown that the threshold effect takes hold whenever the Barankin bound departs significantly from the Cramer–Rao bound. In terms of the signal-to-noise ratio (SNR) and the data length $N$, the quantity $N\,\mathrm{SNR}$ is shown to be a good indicator for deciding whether the SNR is above threshold or not.

I. INTRODUCTION

The problem of estimating the frequency of a sinusoid in additive white Gaussian noise is one of considerable interest. In most cases [1], [2], the maximum likelihood (ML) procedure is used to obtain what one expects to be a sufficiently unbiased and efficient estimator of the frequency. Owing to the nonlinear nature of the frequency estimation problem, the so-called threshold effect [2], [3] takes hold whenever the SNR drops below a critical, data-length-dependent level. The threshold effect is characterized by an almost instantaneous and drastic deterioration of the frequency estimator variance with respect to the Cramer–Rao bound (CRB) below this critical SNR level.
In [2], the threshold effect was related to the existence of highly probable outliers lying far from the true frequency, and in [3], a more technical device related to the phase-locked loop was proposed to explain the phenomenon. The aim of this correspondence is to provide a more fundamental approach to the understanding of the threshold effect. Our starting point is the fact that the CRB, although it is the best lower bound in the linear Gaussian case, is a less appropriate tool when dealing with nonlinear problems such as frequency estimation. For nonlinear problems, the Barankin bound (BRB) [4], [5] is a stronger lower bound on the variances of unbiased estimators, and it includes the CRB as a limiting case. The threshold region can therefore, in the sense of Barankin, be defined as the region where the BRB suddenly departs from the CRB. This is fully exploited in the sequel, resulting in a simple indicator quantity for threshold behavior in frequency estimation.

Manuscript received March 13, 1996; revised March 27, 1997. The associate editor coordinating the review of this paper and approving it for publication was Dr. Ananthram Swami. The author is with the Department of Information Technology, Intec, Gent, Belgium (e-mail: knokaert@intec.rug.ac.be). Publisher Item Identifier S 1053-587X(97)06438-6.

II. THE BARANKIN BOUND

The simplest form of the BRB for the estimation of a scalar real parameter $\theta$ can be stated as follows [5]. Let $p(\mathbf{x};\theta)$ be the probability density of the data vector $\mathbf{x}$, given $\theta$. Let $h$ be a real number, independent of $\mathbf{x}$, such that $\theta+h$ ranges over all possible values of $\theta$. Then, for any unbiased estimator $\hat{\theta}$, we have
$$\operatorname{var}\hat{\theta} \ge \mathrm{BRB} \tag{1}$$
where
$$\mathrm{BRB} = \sup_{h\neq 0}\; \frac{h^{2}}{\displaystyle\int \frac{p^{2}(\mathbf{x};\theta+h)}{p(\mathbf{x};\theta)}\,d\mathbf{x} \;-\; 1} \;\ge\; \mathrm{CRB} \tag{2}$$
and the CRB is given by
$$\mathrm{CRB} = \left\{ E\!\left[\left(\frac{\partial \ln p(\mathbf{x};\theta)}{\partial \theta}\right)^{2}\right]\right\}^{-1}. \tag{3}$$
To avoid theoretical complications, we assume that the integral in the denominator of (2) exists and that $p(\mathbf{x};\theta)$ and its partial derivative with respect to $\theta$ are well defined for almost all $\mathbf{x}$.
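As a quick numerical illustration of (1)–(3), the single-test-point bound in (2) can be evaluated on a grid of test points $h$. The model below (a single scalar observation $x \sim N(\theta,\sigma^2)$, the grid, and the helper name `chapman_robbins_bound`) is an illustrative assumption, not taken from the paper; for this linear Gaussian model the integral in (2) evaluates to $\exp(h^2/\sigma^2)$, and the bound approaches the CRB $= \sigma^2$ as $h \to 0$:

```python
import math

def chapman_robbins_bound(h: float, sigma2: float) -> float:
    """Single-test-point Barankin (Chapman-Robbins) bound for x ~ N(theta, sigma2).

    For this model the integral in (2) equals exp(h^2/sigma2), so the bound
    is h^2 / (exp(h^2/sigma2) - 1).  math.expm1 keeps small-h accuracy.
    """
    return h * h / math.expm1(h * h / sigma2)

sigma2 = 2.0                            # noise variance; CRB for the mean is sigma2
crb = sigma2
hs = [10.0 ** k for k in range(-6, 1)]  # grid of test points h
brb = max(chapman_robbins_bound(h, sigma2) for h in hs)

# Linear Gaussian case: the supremum is attained as h -> 0 and equals the CRB,
# so no Barankin threshold effect appears.
print(f"CRB = {crb}, grid BRB = {brb:.6f}")
```

The bound is strictly below the CRB for every $h \neq 0$ here and increases monotonically toward it as $h \to 0$, which is exactly the "no threshold" behavior discussed next.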
A natural way to measure the deviation of the BRB from the CRB is the ratio
$$R = \frac{\mathrm{BRB}}{\mathrm{CRB}} \tag{4}$$
where
$$R = \sup_{h\neq 0}\; \frac{h^{2}}{\mathrm{CRB}\,[\eta(\theta,h)-1]}, \qquad \eta(\theta,h) = \int \frac{p^{2}(\mathbf{x};\theta+h)}{p(\mathbf{x};\theta)}\,d\mathbf{x}. \tag{5}$$
When the supremum in (5) is attained in the limit $h \to 0$, we have $R = 1$, and in that case, we say that there is no Barankin threshold effect. This does not imply that there is no threshold effect whatsoever, since there exist still stronger bounds than the above BRB [4], [5]. When the supremum is attained at some $h \neq 0$, there surely exists a threshold effect, since the BRB, and hence the variance of any unbiased estimator, then departs from the CRB. To show what this means in practice, we apply this to a simple but frequently occurring nonlinear problem. Let the observed data vector be given by
$$\mathbf{x} = \mathbf{s}(\theta) + \mathbf{n} \tag{6}$$
where $\mathbf{n}$ is zero-mean white Gaussian noise with variance $\sigma^{2}$ per component, and $\mathbf{s}(\theta)$ is a function, in general nonlinear, mapping the parameter $\theta$ into the data space. After some elementary calculations, we obtain
$$\eta(\theta,h) = \exp\!\left(\frac{\|\mathbf{s}(\theta+h)-\mathbf{s}(\theta)\|^{2}}{\sigma^{2}}\right). \tag{7}$$
Note that when the problem is linear, i.e., when $\mathbf{s}'(\theta)$ is a constant vector, the quantity $h^{2}/[\eta(\theta,h)-1]$ is a strictly decreasing function of $h^{2}$, which implies that $R = 1$. This is easily understood, since linear problems in additive Gaussian noise never exhibit a threshold effect. Note also that when we have $M$ independent realizations of the same process, the above formulas remain valid after replacing $\sigma^{2}$ with $\sigma^{2}/M$. 1053–587X/97$10.00 © 1997 IEEE
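The behavior of the ratio (4) under the Gaussian model (6) and (7) can be sketched numerically for a sinusoid. The specifics below (the signal $s_n(\theta)=\cos 2\pi\theta n$, the test-point grid, the two noise levels, and the helper name `barankin_ratio`) are illustrative assumptions rather than values from the paper; the sketch only shows the qualitative effect: at high SNR the grid supremum sits at $h \to 0$ and $R \approx 1$, while at low SNR it jumps to a large $h$ and $R \gg 1$, signaling the threshold.

```python
import math

def barankin_ratio(theta: float, n_samples: int, sigma2: float) -> float:
    """Grid approximation of R = BRB/CRB from (4)-(7) for s_n = cos(2*pi*theta*n)."""
    n = range(n_samples)
    s = lambda f: [math.cos(2 * math.pi * f * k) for k in n]
    # CRB = sigma2 / ||s'(theta)||^2, with s'_n = -2*pi*n*sin(2*pi*theta*n)
    ds2 = sum((2 * math.pi * k * math.sin(2 * math.pi * theta * k)) ** 2 for k in n)
    crb = sigma2 / ds2
    base = s(theta)
    brb = 0.0
    for h in [10.0 ** k for k in range(-7, 0)] + [0.2, 0.3, 0.4, 0.5]:
        d2 = sum((a - b) ** 2 for a, b in zip(s(theta + h), base))
        arg = d2 / sigma2              # exponent in (7)
        if arg > 700.0:                # exp would overflow; the bound is negligible
            continue
        brb = max(brb, h * h / math.expm1(arg))
    return brb / crb

r_high = barankin_ratio(0.123, 64, sigma2=0.01)  # high SNR: R stays near 1
r_low = barankin_ratio(0.123, 64, sigma2=64.0)   # low SNR: R blows up (threshold)
print(f"R(high SNR) = {r_high:.3f}, R(low SNR) = {r_low:.1f}")
```

The supremum switching from $h \to 0$ to a large, "outlier-sized" $h$ is the Barankin picture of the outlier mechanism described in [2].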