An Ontology-based Data Fusion Framework for Profiling Sensors

Cartik Kothari 1, Joseph Qualls 2, and David Russomanno 1

1 Purdue School of Engineering and Technology
Indiana University-Purdue University Indianapolis
Indianapolis, IN, U.S.A.
ckothari@iupui.edu, drussoma@iupui.edu

2 Research and Engineering
RenderMatrix, Inc.
Memphis, TN, U.S.A.
jqualls@rendermatrix.com

Abstract--Data-to-decision systems must fuse information from heterogeneous sources to infer a high-level understanding of a situation. A high degree of confidence in the inferred knowledge is necessary for appropriate actions to be taken based upon the assessment of a situation. This paper presents an extensible, Semantic Web-compatible framework that uses rich ontological descriptions for the autonomous and human-aided fusion of heterogeneous sensors and algorithms to create evidence-based hypotheses of a situation under persistent surveillance. Raw data acquired from profiling sensors is combined with the output of visualization and classification algorithms, yielding information with a higher degree of confidence than would be obtained without the fusion process. The framework can readily accommodate other data sources and algorithms into the fusion process.

KEYWORDS: Semantic Web, Ontology, Sensor Network, Data Fusion, Situation Awareness, Data to Decision Framework, Autonomous Decision Systems

I. INTRODUCTION

The fusion of data from heterogeneous sources, such as sensors and intelligence reports, is integral to the inference of highly reliable, evidence-based knowledge of a situation. The degree of confidence in the inferred knowledge improves as further evidence is acquired, that is, as data from more sources are fused. This paper describes an extensible, Semantic Web-compatible framework for the autonomous fusion of data from heterogeneous sensors and algorithms, allowing a human operator who receives the fused data to assess a situation with increased confidence in the context of persistent surveillance.
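The intuition that confidence improves as evidence from more sources is fused can be sketched numerically. The following Python snippet is illustrative only and is not part of the framework described in this paper; it combines per-source confidences in a single hypothesis under a naive assumption that the sources are independent, with the sample confidence values chosen purely for demonstration:

```python
from functools import reduce

def fuse_confidence(confidences):
    """Combine independent per-source confidences in the same hypothesis.

    Under a naive independence assumption, the fused confidence is the
    probability that at least one source is correct:
        fused = 1 - prod(1 - c_i)
    """
    return 1.0 - reduce(lambda acc, c: acc * (1.0 - c), confidences, 1.0)

# A single profiling sensor (hypothetical confidence value):
print(round(fuse_confidence([0.70]), 2))              # 0.7
# Fusing in a classification algorithm's output:
print(round(fuse_confidence([0.70, 0.60]), 2))        # 0.88
# A third source raises the fused confidence further:
print(round(fuse_confidence([0.70, 0.60, 0.50]), 2))  # 0.94
```

Real fusion systems rarely satisfy strict independence, so this closed form overstates the gain from correlated sources; it serves only to show why each additional evidence source can raise the overall confidence in an assessment.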
Situation awareness has been defined by Endsley [1] as the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future. Situation awareness is critical to decision making in many applications, such as patient monitoring, emergency response, military command and control, and border surveillance. Systems for situation awareness require the fusion of a myriad of data and knowledge sources, including disparate sensor systems, algorithms, and intelligence reports. Semi-automated and automated inference using fused data may lead to enhanced knowledge about the entities of interest in a situation, as well as increased confidence in their interrelationships, enabling situation awareness. End-user confidence in the inferred knowledge is critical to timely and appropriate actions. Integration of sensor data with algorithmic processes and human-controlled information systems poses a significant challenge for network-centric sensor frameworks.

Fig. 1 summarizes the classical Joint Directors of Laboratories (JDL) fusion levels [2], augmented with a knowledge management component [3]. These six fusion levels cover both automated and human-aided fusion processes. The model supports the concept of autonomous algorithms and human users contributing to an evolving solution state in which fused information may enable the identification and assessment of strategies and tactics for counterintelligence [3-5]. The JDL model is useful for describing the conceptual framework within which a particular fusion process occurs, and it also provides a reference for describing the level of fusion in an overall process.

Fig. 1. Six levels of the data fusion model augmented with a data and knowledge management system.

Multi-sensor data integration has been limited primarily by the lack of standards for data exchange and for describing sensor capabilities and specifications, which would enable their automated discovery, invocation, and composition with other sensors as part of process workflows [6-7]. The XML-based