IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 47, NO. 1, JANUARY 1999

Sequential Algorithms for Observation Selection

Stanley J. Reeves, Senior Member, IEEE, and Zhao Zhe

Abstract—Some signal reconstruction problems allow for flexibility in the selection of observations and, hence, the signal formation equation. In such cases, we have the opportunity to determine the best combination of observations before acquiring the data. We present and analyze two classes of sequential algorithms to select observations: sequential backward selection (SBS) and sequential forward selection (SFS). Although both are suboptimal, they perform consistently well. We analyze the computational complexity of various forms of SBS and SFS and develop upper bounds on the sum of squared errors (SSE) of the solutions obtained by SBS and SFS.

I. INTRODUCTION

WE CONSIDER a signal g, which is a linearly transformed version of f observed in the presence of additive noise. This signal is described by

    g = Af + n                                                  (1)

where n is additive noise. Suppose that A is M x N, where M >= N. The goal is to reconstruct a good estimate of f given the observed signal g. In many applications, the matrix A is known a priori, but the elements of g are not. Observing the elements of g may be expensive, time-consuming, or risky; in such cases, we want to limit the number of observations. This is the case in certain problems in magnetic resonance (MR) imaging and MR spectroscopic imaging (MRSI) [1]–[4]. The same concept is applicable in placing sensors in control problems [5] and remote sensing problems [6] and in determining the geometry of antenna arrays. The process is related to statistical experiment design, in which the experimental data are chosen to provide the best information about the unknown regression parameters given a specific regression model [7]. We would like to observe the K < M elements of g that provide the best possible reconstruction of f, using only the information in the matrix A to make the selection of observations.
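As a concrete illustration of the measurement model (1) and the least-squares reconstruction it supports, consider the following minimal NumPy sketch. The dimensions, noise level, and variable names here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Illustrative sketch of the linear model g = A f + n (Eq. (1)) and a
# least-squares reconstruction; M, N, and the noise level are assumptions.
rng = np.random.default_rng(0)
M, N = 12, 4
A = rng.standard_normal((M, N))     # known M x N transform matrix
f = rng.standard_normal(N)          # unknown signal to be reconstructed
n = 0.01 * rng.standard_normal(M)   # zero-mean i.i.d. additive noise
g = A @ f + n                       # observed signal

# Least-squares estimate of f from g, i.e. f_hat = argmin ||A f - g||^2.
f_hat, *_ = np.linalg.lstsq(A, g, rcond=None)
```

With low noise and a well-conditioned A, `f_hat` recovers `f` closely; observation selection asks which rows of A (elements of g) to keep when not all of g can be acquired.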
This is equivalent to choosing the K rows of A that correspond to the best observations to acquire. We call this problem observation selection [8]. We must define a criterion for the optimal choice of rows and deal with the resulting combinatoric optimization problem.

Manuscript received May 20, 1997; revised June 12, 1998. This work was supported by a Biomedical Engineering Research Grant from the Whitaker Foundation. The associate editor coordinating the review of this paper and approving it for publication was Dr. Phillip A. Regalia.
S. J. Reeves is with the Department of Electrical Engineering, Auburn University, Auburn, AL 36849 USA (e-mail: sjreeves@eng.auburn.edu).
Z. Zhe was with the Department of Electrical Engineering, Auburn University, Auburn, AL 36849 USA. He is now with the Department of Electrical and Computer Engineering, University of Texas at Austin, Austin, TX 78712 USA.
Publisher Item Identifier S 1053-587X(99)00144-0.

In previous work, we derived a sum of squared errors (SSE) criterion as a function of the rows of A under the assumption of zero-mean i.i.d. noise and a least-squares reconstruction [8]. We also proved that the criterion increases monotonically as rows are removed from A. Establishing this property allowed us to apply branch-and-bound (B&B) to the optimization problem to determine an optimal combination of observations (rows). B&B is much more efficient than exhaustive search and also yields an optimal result. However, B&B can still be computationally prohibitive for even moderately sized problems. Therefore, we also proposed the use of sequential backward selection (SBS). SBS sequentially eliminates one row at a time until K rows remain. Although this approach is suboptimal, it eliminates the combinatoric problem and has performed consistently well in all the examples tried. However, the suboptimality of SBS raises the issue of whether a certain level of performance can be guaranteed if SBS is used.
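The SBS procedure just described can be sketched as follows. This is a naive, assumption-laden illustration: it writes the SSE criterion of [8] as tr[(A_S^H A_S)^{-1}] for the submatrix A_S of retained rows and recomputes it from scratch at every step, rather than using the efficient recursive updates this paper develops:

```python
import numpy as np

def sse_criterion(A_S):
    # tr[(A_S^H A_S)^{-1}]: proportional to the reconstruction SSE [8]
    return np.trace(np.linalg.inv(A_S.conj().T @ A_S)).real

def sbs(A, K):
    """Naive SBS sketch: eliminate one row at a time until K rows remain."""
    kept = list(range(A.shape[0]))
    while len(kept) > K:
        # Remove the row whose deletion increases the criterion least.
        worst = min(kept,
                    key=lambda i: sse_criterion(A[[j for j in kept if j != i]]))
        kept.remove(worst)
    return kept

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))   # 8 candidate observations, 3 unknowns
kept = sbs(A, 4)                  # indices of the 4 retained rows
```

Because removing a row can only increase tr[(A_S^H A_S)^{-1}], each elimination step greedily picks the least damaging row; the brute-force criterion evaluations here are what the efficient implementations in Section II avoid.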
Another drawback to the SBS strategy is that none of the observations in the candidate set can be guaranteed to be in the selected set until the optimization procedure is complete. Thus, we cannot begin data acquisition until all unwanted candidate observations have been eliminated. This drawback motivates the development of a sequential forward selection (SFS) strategy. In this strategy, we sequentially select, rather than eliminate, one row at a time from the candidate matrix. Because the algorithm is committed to retain the selected observation at each step, data acquisition can begin immediately after the first observation is selected, assuming an initial set of observations can be chosen a priori.

In the next section, we derive efficient implementations of the SBS and SFS algorithms and analyze the computational requirements of these implementations. In Section III, we derive upper bounds on the SSE of the SBS and SFS solutions. In Section IV, we demonstrate the algorithms and bounds with simulations.

II. SEQUENTIAL ALGORITHMS

A. Selection Criteria

If the noise is i.i.d. and the reconstruction of f is performed via least squares, we have shown [8] that the SSE in the reconstruction is proportional to

    tr[(A^H A)^{-1}]                                            (2)

A more general form of the criterion above is given by

    tr[B (A^H A)^{-1}]                                          (3)

where B is Hermitian and non-negative definite. This form covers two cases.
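By analogy with SBS, the forward strategy under a criterion of the form (2) can be sketched as follows. The sketch assumes an initial index set S0 with full column rank is given a priori (as the SFS discussion above requires), and its brute-force criterion evaluation is illustrative rather than an efficient implementation:

```python
import numpy as np

def sse_criterion(A_S):
    # tr[(A_S^H A_S)^{-1}] for the submatrix A_S of selected rows
    return np.trace(np.linalg.inv(A_S.conj().T @ A_S)).real

def sfs(A, K, S0):
    """Naive SFS sketch: starting from the a priori set S0, add rows until K are selected."""
    selected = list(S0)
    while len(selected) < K:
        rest = [i for i in range(A.shape[0]) if i not in selected]
        # Add the candidate row that yields the smallest criterion value.
        best = min(rest, key=lambda i: sse_criterion(A[selected + [i]]))
        selected.append(best)
    return selected

rng = np.random.default_rng(2)
A = rng.standard_normal((8, 3))
selected = sfs(A, 5, S0=[0, 1, 2])   # S0: assumed initial observations
```

Adding a row can only decrease tr[(A^H A)^{-1}], since A^H A grows in the positive semidefinite ordering as rows are appended; this is the forward counterpart of the monotonicity property used for B&B and SBS.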