2962 IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 49, NO. 12, DECEMBER 2001
Further Results and Insights on Subspace Based
Sinusoidal Frequency Estimation
Martin Kristensson, Magnus Jansson, Member, IEEE, and Björn Ottersten, Senior Member, IEEE
Abstract—Subspace-based methods for parameter identifica-
tion have received considerable attention in the literature. Starting
with a scalar-valued process, it is well known that subspace-based
identification of sinusoidal frequencies is possible if the
scalar-valued data is windowed to form a low-rank vector-valued process.
MUSIC and ESPRIT-like estimators have, for some time, been
applied to this vector model. In addition, a statistically attrac-
tive Markov-like procedure for this class of methods has been
proposed. Herein, the Markov-like procedure is reinvestigated.
Several results regarding rank, performance, and structure are
given in a compact manner. The large sample equivalence with
the approximate maximum likelihood method by Stoica et al. is
also established.
Index Terms—Correlation, eigenvalues and eigenfunctions,
frequency estimation, maximum likelihood estimation, multi-
dimensional signal processing, singular value decomposition,
spectral analysis.
I. INTRODUCTION
MODEL-based parameter estimation using subspace-based
methods can be an attractive alternative
to maximum likelihood estimation (MLE). In many cases,
subspace methods provide accurate estimates at a reasonable
computational cost. To apply subspace methods, a low-rank
model of the system at hand must be available. In some
cases, like in array signal processing, this structure is present
directly in the received data. In other cases, e.g., sinusoidal
frequency estimation [3], [18], [19], system identification [10],
[27], and blind channel identification [13], [26], the low-rank
vector-valued data structure can be obtained by applying
a window to the received data. Vector-valued data models
obtained from an underlying scalar-valued process are, in this
paper, referred to as windowed data models.
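As a concrete illustration of such a windowed data model, the following sketch stacks consecutive samples of a noise-free sum of two complex sinusoids into vectors and checks the numerical rank of the resulting snapshot matrix. All parameter values (window length, frequencies, sample size) are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical parameters: two unit-amplitude complex sinusoids
# observed over N samples; window length m.
rng = np.random.default_rng(0)
N, m = 200, 6
freqs = np.array([0.5, 1.3])                # rad/sample, distinct
phases = rng.uniform(0, 2 * np.pi, size=freqs.size)
t = np.arange(N)
y = np.sum(np.exp(1j * (np.outer(freqs, t) + phases[:, None])), axis=0)

# Window the scalar process: each column is y_m(t) = [y(t), ..., y(t+m-1)]^T,
# giving an m x (N-m+1) vector-valued snapshot matrix.
Y = np.lib.stride_tricks.sliding_window_view(y, m).T

# Without noise the snapshots span only a d = 2 dimensional subspace of
# the m = 6 dimensional space; this low-rank structure, induced purely by
# windowing, is what subspace methods exploit.
s = np.linalg.svd(Y, compute_uv=False)
print(np.sum(s > 1e-8 * s[0]))              # numerical rank → 2
```

With additive noise the sample covariance of the windowed vectors is full rank, but its d dominant eigenvectors still estimate the signal subspace.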
Intuitively, the statistical properties of subspace methods
when applied to windowed data models are different from
models where the low-rank structure is physically present
in the system. In this paper, the statistical properties of sub-
space-based estimators applied to windowed data models
are explained using a subspace-based sinusoidal frequency
estimator as an example. The focus is thus not on obtaining
a new estimator but on gaining insight into the behavior of the
studied class of methods. The estimator is close to the algorithm
presented in [3], which was proposed as a statistically attractive
alternative to MUSIC [19] and ESPRIT [18]. Several variations
of these methods exist, and the area is still progressing; see,
e.g., [5]. For an introduction and for a more complete list of
references concerning frequency estimation, see [16] and [21].

Manuscript received August 25, 1998; revised August 30, 2001. The associate editor coordinating the review of this paper and approving it for publication was Prof. Lang Tong.
M. Kristensson is with Nokia Networks, Kista, Sweden (e-mail: martin.kristensson@nokia.com).
M. Jansson and B. Ottersten are with the Department of Signals, Sensors, and Systems, Royal Institute of Technology, Stockholm, Sweden (e-mail: magnusj@s3.kth.se).
Publisher Item Identifier S 1053-587X(01)10486-1.
The complicated statistical structure of the vector-valued
process is evident from the analysis in [3]. However, we
show here that by carefully exploiting the structure, compact
expressions for the estimation error covariance can, in fact,
be obtained. In addition, these expressions enable further
analysis of the rank properties of certain weighting and residual
covariance matrices. These rank properties were left as an
open question in [3], but they are, in fact, essential when
determining optimal weighting matrices. The rank properties
also make it possible to establish the large sample equivalence
of the Markov estimator of [3] and the approximate maximum
likelihood (AML) approach in [22]. This relation shows that
the subspace approach provides the minimum asymptotic error
covariance in the class of all estimators based on a given set of
covariance estimates.
The outline of the paper is as follows. In Section II, the data
model is presented, followed by a description of the subspace-
based estimator in Section III. Next, the large sample equiv-
alence of different sample estimates of covariance matrices is
discussed. The central parts of the paper are Sections V and VI,
where the statistical results are derived. The paper is concluded
with some implementational aspects and a simulation example
in Section VII.
II. DATA MODEL AND DEFINITIONS
The N samples of the scalar-valued observed signal y(t) are
assumed to be the sum of d complex-valued sinusoids in additive
zero-mean white Gaussian noise n(t):

y(t) = \sum_{k=1}^{d} \alpha_k e^{i(\omega_k t + \varphi_k)} + n(t),   t = 1, \ldots, N   (1)

Here, \alpha_k is the real-valued amplitude, \omega_k the frequency,
and \varphi_k the phase of the kth sinusoid. The amplitudes
\{\alpha_k\} and the frequencies \{\omega_k\} are modeled as
deterministic quantities. The frequencies are assumed to be
distinct. The phases \{\varphi_k\} are uniformly distributed on
[0, 2\pi)
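A minimal sketch of generating data according to model (1) follows. All numerical values (d, amplitudes, frequencies, noise variance) are assumptions chosen for illustration, not values from the paper:

```python
import numpy as np

# Assumed model parameters: d = 2 sinusoids with real amplitudes alpha_k,
# distinct frequencies omega_k (rad/sample), phases uniform on [0, 2*pi),
# and circular complex white Gaussian noise of variance sigma2.
rng = np.random.default_rng(1)
N, sigma2 = 1000, 0.1
alpha = np.array([1.0, 0.5])
omega = np.array([0.4, 1.1])
phi = rng.uniform(0, 2 * np.pi, size=alpha.size)

t = np.arange(N)
# y(t) = sum_k alpha_k * exp(i(omega_k t + phi_k)) + n(t), per (1).
signal = alpha @ np.exp(1j * (np.outer(omega, t) + phi[:, None]))
noise = np.sqrt(sigma2 / 2) * (rng.standard_normal(N)
                               + 1j * rng.standard_normal(N))
y = signal + noise
```

Splitting the noise variance equally between the real and imaginary parts makes E|n(t)|^2 = sigma2, the usual convention for circular complex Gaussian noise.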