A Hybrid Fusion Strategy for Spatial Change Detection

FARHAD SAMADZADEGAN, AHMAD ABOOTALEBI, MANA NIKFAL
Department of Surveying and Geomatics, University of Tehran, North Amir Abad - Faculty of Engineering - Tehran, IRAN

Abstract: - Monitoring changes in topographic urban geospatial databases is one of the main requirements of urban planners, decision-makers and managers. In this paper, an attempt has been made to design and develop an automatic solution for spatial change detection of objects. The approach presented here takes advantage of a hybrid fusion of descriptive and logical information: descriptive fusion to exploit the multi-level characteristics of the objects, and logical fusion to enhance the learning ability of object recognition in the change detection process. The potential of the proposed methodology was evaluated on a 1:2000 scale digital map of the city of Qom in Iran, using 1:10000 aerial photos and a pan-sharpened Quickbird scene. Visual inspection of the obtained results demonstrates the high capability of the proposed method.

Key-Words: - Automatic Change Detection, Descriptive and Logical Information Fusion, Fuzzy Reasoning, Neural Network, Neuro-fuzzy, Quickbird Imagery

1 Introduction
In recent years, significant attention has focused on multi-sensor data fusion to increase the capabilities of intelligent machines and systems [0]. Data fusion techniques combine data from multiple sensors and related information to achieve more specific inferences than could be achieved with a single, independent sensor. For this reason, information fusion has become an area of intense research activity in recent years [0, 0, 0, 0, 0]. Data fusion covers a wide domain, and it is difficult to provide a precise definition; several definitions can be found in the literature [0, 0, 0, 0].
Among them, a comprehensive definition is given by Dasarathy: "Data fusion deals with the synergistic combination of information made available by various knowledge sources such as sensors, in order to provide a better understanding of a given scene" [00]. Data fusion involves combining data to estimate or predict the state of some aspect of the universe. Often the objective is to estimate or predict the physical state of entities: their identity, attributes, activity, location and motion over some past, current, or future time period. If the task is to estimate the state of people (or any other sentient beings), it may be important to estimate or predict the informational and perceptual states of individuals and groups, and the interaction of these with physical states.

The concept of multi-sensor data fusion is not new. As humans and animals have evolved, they have developed the ability to use multiple senses to help them survive. For example, assessing the quality of an edible substance may not be possible using only the sense of vision; the combination of sight, touch, smell, and taste is far more effective. Similarly, when vision is limited by obstacles, the sense of hearing can provide advance warning of impending dangers. Thus, multi-sensory data fusion is naturally performed by animals and humans to assess the surrounding environment more accurately and to identify threats, thereby improving their chances of survival. While the concept of data fusion is not new, the emergence of new sensors, advanced processing techniques, and improved processing hardware has made real-time fusion of data increasingly viable [0].

Information fusion may play a key role in geospatial information related topics. For example, in multi-agent systems, fusing the information each agent perceives through its sensors from the surrounding environment can lead to a better computational model.
One of the important issues concerning information fusion is determining how to fuse the information or data. Depending on the stage at which fusion takes place, it is often divided into three categories, namely data level, feature level and decision level [00, 0]. In data level fusion, the combination mechanism works directly on the data.

Proceedings of the 2nd WSEAS International Conference on Remote Sensing, Tenerife, Canary Islands, Spain, December 16-18, 2006