INFORMATION ASSESSMENT OF SAR DATA FOR ATR

Capt. Erik P. Blasch          Mike Bryant
AFRL/SNAT                     AFRL/SNAT
241 Avionics Cir.             241 Avionics Cir.
WPAFB, OH 45433               WPAFB, OH 45433

Abstract

Without successful adaptive multisensor fusion or online registration techniques, automatic target recognition (ATR) algorithms are prone to poor object classifications. Multisensor fusion for a given situation assessment includes identifying measurement information for task completion and reducing image uncertainty in the presence of clutter. By extracting synthetic aperture radar (SAR) image informational features, image registration and target classification are achievable. This paper examines SAR information-theoretic features for target orientation and proposes a method for target classification.

1.0 Introduction

Multisensor automatic target recognition (ATR) algorithms include target classification as a subset of sensor management. Sensor management includes selecting sensors, sensor detection and recognition policies, and classification algorithms for a given set of mission requirements [1]. For example, a typical tactical aircraft carries an onboard active radar sensor that outputs physical measurements that can be used to synthetically generate an image, a Synthetic Aperture Radar (SAR) profile. SAR images can be utilized for kinematic and identity estimates to detect, recognize, identify, classify, and track objects of interest while reducing pilot workload. In a complex environment, the onboard sensor manager must select the correct sensor or sensor mode to measure the correct object at a given time. Thus, the sensor manager must control the measurement sequencing process. A sensor classification policy is best described as a problem in sequential decision making under uncertainty.
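As a minimal sketch of sequential decision making under uncertainty (not the paper's algorithm), the snippet below applies Bayes' rule step by step to refine a belief over target classes as measurements arrive. The class names, measurement outcomes, and likelihood values are hypothetical placeholders for illustration only.

```python
# Illustrative sketch: sequential Bayesian update of a target-class belief.
# All class names and likelihood values are hypothetical.
import numpy as np

def update_belief(prior, likelihood):
    """One step of Bayes' rule: posterior proportional to likelihood * prior."""
    posterior = likelihood * prior
    return posterior / posterior.sum()

# Hypothetical likelihoods P(measurement | class) for two classes
# ("tank", "truck") and two discrete measurement outcomes.
likelihoods = {
    "long_return":  np.array([0.8, 0.3]),   # P(long_return | tank), P(long_return | truck)
    "short_return": np.array([0.2, 0.7]),
}

belief = np.array([0.5, 0.5])               # uninformative prior
for z in ["long_return", "long_return", "short_return"]:
    belief = update_belief(belief, likelihoods[z])

print(belief)  # belief over (tank, truck) after three measurements
```

Each measurement reweights the belief by its class-conditional likelihood, so the sensor manager can stop measuring once one class dominates.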
Prominent elements of the problem include a knowledgeable competitor, a dynamic environment with uncertainties in target orientation and measurement clutter, and complexity arising from many possible sensor actions and outcomes [2].

From an ATR point of view, geometric target information invariants are essential to linking features and geometry. With invariants, image properties of geometric transformations on objects, such as affine transformations or sensor projections from the 3D world to the sensor output space, are intrinsic to the object's structure. These invariants are not dependent on the particular choice or realization of transformation parameters. Since the geometric perspective of an object in the image cannot generally be controlled, the object recognition task can be simplified by determining information invariants.

Most invariance work as applied to computer vision has focused on geometric invariants, or 3D structure properties that are invariant to the imaging geometry. Typically these are expressed as theorems on distances or angles measured on point and line sets that are constant over the pose space. Recent work has focused on photometric invariants, or invariants to the spatio-temporal object signature. Examples include color distribution and hyperspectral signature, EO/IR, or RF polarization invariants. The association analogy we explore is information features to imaging geometry for object orientation. We seek object properties, as measured by the spatial/spectral intensity of a SAR output, that are not dependent on the dynamics of the signature formation process. The information can thus be used for 2D and 3D image registration [3].

This paper assesses uncertainty management for target classification. In sensor-target classification, a sensor is directed to perform a sequence of measurements that isolate a target. The challenge is to guide the sensor so it identifies the orientation of the target “efficiently”.
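One simple information feature of the kind described above is the Shannon entropy of a SAR image chip's intensity distribution. The sketch below (not the authors' method; the synthetic chips are assumptions for illustration) shows how entropy separates a diffuse, clutter-like chip from one dominated by a single scatterer.

```python
# Illustrative sketch: Shannon entropy of a SAR image-chip intensity
# histogram as an information feature.  The chips are synthetic.
import numpy as np

def image_entropy(chip, bins=32):
    """Shannon entropy (bits) of the pixel-intensity distribution."""
    hist, _ = np.histogram(chip, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins (0 * log 0 := 0)
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(0)
uniform_chip = rng.uniform(0.0, 1.0, size=(64, 64))           # clutter-like
peaked_chip = np.zeros((64, 64)); peaked_chip[32, 32] = 1.0   # lone scatterer

print(image_entropy(uniform_chip))   # near log2(32) = 5 bits
print(image_entropy(peaked_chip))    # near 0 bits
```

Because the feature depends only on the intensity distribution, not on where the scatterers fall in the image, it is insensitive to translation of the target within the chip, which is the flavor of invariance sought here.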
The work reported here follows that of Blasch [1], who investigated how learning can be used for searching and detection. The methodology follows that of information theory to determine the content of a Synthetic Aperture Radar (SAR) image, and the algorithm is evaluated using information-theoretic constraints.

E. P. Blasch and M. Bryant, “Information Assessment of SAR Data for ATR,” Proceedings of the IEEE National Aerospace and Electronics Conference, Dayton, OH, pp. 414-419, July 1998. Runner-up, best paper award.