ORIGINAL PAPER

An efficient fusion technique for quality enhancement of remotely sensed images

Amr M. Ragheb, Mohammed Amoon, Hanaa Abdallah, Saleh M. Elkaffas, Tarek A. El-Tobely, S. Khamis, Mohamed E. Nasr, Fathi E. Abd El-Samie

Received: 24 July 2013 / Accepted: 7 July 2014
© Società Italiana di Fotogrammetria e Topografia (SIFET) 2014

Abstract  Remote-sensing satellites provide both high-resolution panchromatic and low-resolution multi-spectral images. In this paper, a pixel-level multi-sensor image fusion technique is proposed for improving the spectral quality of the fused multi-spectral and panchromatic remote-sensing images. The proposed fusion technique integrates both the intensity, hue, and saturation (IHS) and the discrete wavelet frame transform (DWFT) techniques. The IHS fusion technique provides high spatial quality, and the DWFT fusion technique is both aliasing-free and translation-invariant, but the color distortion in both techniques is noticeable. The proposed fusion technique reduces the spectral discrepancy of the fused images while preserving the spatial quality at an acceptable level. Landsat-5 (TM) with SPOT (Pan), Landsat-7 (ETM+), and IKONOS panchromatic and multi-spectral images have been fused using the proposed technique. The statistical analysis shows that this technique improves the fusion quality compared to other known fusion techniques, such as the conventional IHS, the discrete wavelet transform (DWT), the DWFT, and the integrated IHS and DWT.

Keywords  Image fusion . Remote sensing . IHS . DWFT . Spectral quality

Introduction

Image fusion combines complementary information from multiple image sources to improve information content or decision making. As stated by Li et al.
(2002), the complementary information about the same observed scene can be collected in the following cases: (1) data recorded by the same sensor scanning the same scene at different dates (multi-temporal image fusion); (2) data recorded by the same sensor operating in different spectral bands (multi-spectral image fusion); (3) data recorded by the same sensor in different polarizations (multi-polarization image fusion); (4) data recorded by the same sensor located on platforms flying at different heights (multi-resolution image fusion); and (5) data recorded by different sensors (multi-sensor image fusion).

Furthermore, multi-sensor image fusion can be divided into three levels: pixel level, feature level, and decision level. At the pixel level, fusion works directly on the pixels obtained at the sensor output. Feature-level fusion, on the other hand, works on image features extracted from the source images. Decision-level image fusion merges the interpretations of different images obtained after image understanding.

A. M. Ragheb · S. Khamis · M. E. Nasr
Department of Electronics and Electrical Communications Engineering, Faculty of Engineering, Tanta University, Tanta, Egypt

M. Amoon
Department of Computer Science and Engineering, Faculty of Electronic Engineering, Menoufia University, Menouf 32952, Egypt

M. Amoon
Department of Computer Science, King Saud University, Riyadh, Saudi Arabia

H. Abdallah
Faculty of Engineering, Zagazig University, Zagazig, Egypt

S. M. Elkaffas
College of Computing & Information Technology, Arab Academy for Science, Technology & Maritime Transport, Alexandria, Egypt

T. A. El-Tobely
Computer and Automatic Control Department, Tanta University, Tanta, Egypt

F. E. A. El-Samie (*)
Department of Electronics and Electrical Communications, Faculty of Electronic Engineering, Menoufia University, Menouf 32952, Egypt
e-mail: fathi_sayed@yahoo.com

Appl Geomat (2014) 6:197–205
DOI 10.1007/s12518-014-0133-0
Published online: 27 July 2014
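To make the pixel-level fusion idea concrete, the following is a minimal sketch of conventional IHS-style component-substitution pan-sharpening, the baseline technique the paper builds on. It is an illustration only, not the paper's combined IHS-DWFT method; the function name, the use of NumPy, the linear intensity (mean of the three bands), and the assumption of float images normalized to [0, 1] at a common resolution are all choices made for this example.

```python
import numpy as np

def ihs_fuse(ms_rgb, pan):
    """Sketch of conventional IHS (component-substitution) pan-sharpening.

    ms_rgb : (H, W, 3) float array in [0, 1], multi-spectral bands already
             upsampled to the panchromatic resolution.
    pan    : (H, W) float array in [0, 1], panchromatic band.
    Returns the fused (H, W, 3) array.
    """
    # Linear IHS: the intensity component is the mean of the three bands.
    intensity = ms_rgb.mean(axis=2)
    # Match the pan image to the intensity's mean/std to limit the
    # spectral (color) distortion the paper discusses.
    pan_matched = ((pan - pan.mean()) / (pan.std() + 1e-12)
                   * intensity.std() + intensity.mean())
    # Component substitution: inject the spatial detail (pan - I)
    # equally into each spectral band.
    detail = pan_matched - intensity
    fused = ms_rgb + detail[..., None]
    return np.clip(fused, 0.0, 1.0)
```

Because the same detail term is added to all three bands, the hue is largely preserved while the spatial detail of the panchromatic band is injected; the residual color distortion of this substitution step is exactly what motivates combining IHS with a wavelet-frame decomposition in the paper.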