Journal of Intelligent and Robotic Systems 1 (1988) 103-116.
© 1988 by Kluwer Academic Publishers.

Sensor Data Fusion

L. F. PAU
Building 348, Technical University of Denmark, DK-2800 Lyngby, Denmark

(Received: 22 April 1987; revised: 9 November 1987)

Abstract. This paper reviews some knowledge representation approaches to the sensor fusion problem, as encountered whenever images, signals, and text must be combined to provide the input to a controller or to an inference procedure. The basic steps involved in deriving the knowledge representation scheme are: (A) locate a representation, based on exogenous context information; (B) compare two representations to determine whether they refer to the same object/entity; (C) merge sensor-based features from the various representations of the same object into a new set of features or attributes; (D) aggregate the representations into a joint fused representation, usually more abstract than each of the sensor-related representations. The importance of sensor fusion stems first from the fact that it is generally correct to assume that improvements in control law simplicity and robustness, as well as better classification results, can be achieved by combining diverse information sources. The second element is that, e.g., spatially distributed sensing, or otherwise diverse sensing, does indeed require fusion as well.

Key words. Artificial intelligence, sensor fusion, vision, knowledge representation, registration, sensors, signal processing.

1. Introduction

This paper presents a number of approaches and techniques by which multisensor data can be fused to improve feature selection and the overall performance of a classification or interpretation system [7, 22, 24, 25]. The latter is generally assumed to consist of:

- NS separate sensing devices or knowledge sources i ∈ S, i = 1, ..., NS, providing pattern representations.
- NF classification features, derived from the NS sensors.
- NT processing and classification stages in a multilevel recognition procedure, using at most NF features.

The importance of information fusion stems first from the fact that it is generally correct to assume that improvements in terms of classification error probability, rejection rate, and interpretation robustness can only be achieved at the expense of additional independent features delivered by more separate sensors. On the other hand, as the number NS of sensors increases, the feature number NF will increase and overwhelm the recognition system in terms of data flows, and