Fusion of high spatial resolution WorldView-2 imagery and LiDAR pseudo-waveform for object-based image analysis

Yuhong Zhou, Fang Qiu ⇑
Department of Geospatial Science, University of Texas at Dallas, 800 W Campbell Rd. GR31, Richardson, TX 75080-3021, USA

Article history: Received 30 August 2014; Received in revised form 6 December 2014; Accepted 15 December 2014

Keywords: Fusion; LiDAR; Imagery; Land cover; Classification; High resolution; Multispectral

Abstract: High spatial resolution (HSR) imagery and high density LiDAR data provide complementary horizontal and vertical information. Therefore, many studies have focused on fusing the two for mapping geographic features. It has been demonstrated that the synergetic use of LiDAR and HSR imagery greatly improves classification accuracy. This is especially true with waveform LiDAR data, since they provide more detailed vertical profiles of geographic objects than discrete-return LiDAR data. Fusion of discrete-return LiDAR and HSR imagery mostly takes place at the object level due to the superiority of object-based image analysis (OBIA) for classifying HSR imagery. However, the fusion of waveform LiDAR and HSR imagery at the object level has not been adequately studied. To fuse LiDAR waveforms and image objects, waveforms for the objects derived from image segmentation are needed. However, the footprints of existing waveforms are usually of fixed size and shape, while those of image objects vary in size and shape. In order to obtain waveforms with footprints that match those of image objects, we proposed synthesizing object-based pseudo-waveforms from discrete-return LiDAR data using count- or intensity-based histograms over the footprints of the objects. The pseudo-waveforms were then fused with object-level spectral histograms from HSR WorldView-2 imagery to classify the image objects using a Kullback–Leibler divergence-based curve matching approach.
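The pseudo-waveform synthesis and curve matching summarized above can be illustrated with a minimal sketch. This is not the authors' implementation: the bin count, height range, class names, and toy data below are all illustrative assumptions; only the core idea — a normalized count- or intensity-based height histogram per object footprint, matched to reference curves by Kullback–Leibler divergence — follows the text.

```python
import numpy as np

def pseudo_waveform(heights, weights=None, bins=20, h_range=(0.0, 30.0)):
    """Synthesize an object-level pseudo-waveform as a normalized histogram
    of LiDAR return heights within an object's footprint. Passing return
    intensities as `weights` yields an intensity-based curve instead of a
    count-based one. Bin count and height range are illustrative choices."""
    hist, _ = np.histogram(heights, bins=bins, range=h_range, weights=weights)
    hist = hist.astype(float) + 1e-9      # avoid zero bins for the KL divergence
    return hist / hist.sum()              # normalize to a probability curve

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two normalized curves."""
    return float(np.sum(p * np.log(p / q)))

def classify(obj_curve, class_curves):
    """Assign the class whose reference curve minimizes the KL divergence."""
    return min(class_curves, key=lambda c: kl_divergence(obj_curve, class_curves[c]))

# Toy example: returns clustered near the ground vs. in a tree canopy.
rng = np.random.default_rng(0)
grass_ref = pseudo_waveform(rng.uniform(0.0, 1.0, 200))    # low returns
tree_ref = pseudo_waveform(rng.uniform(8.0, 15.0, 200))    # canopy returns
obj = pseudo_waveform(rng.uniform(8.5, 14.5, 150))         # unknown object
label = classify(obj, {"grass": grass_ref, "tree": tree_ref})  # -> "tree"
```

In the paper's approach the same divergence-based matching is applied to the fused curve (pseudo-waveform concatenated with the object-level spectral histogram); the sketch shows only the LiDAR half.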
The fused dataset achieved an overall classification accuracy of 97.58%, a kappa coefficient of 0.97, and producer's and user's accuracies all larger than 90%. The use of the fused dataset improved the overall accuracy by 7.61% over the use of HSR imagery alone, and McNemar's test indicated that this improvement was statistically significant (p < 0.001). This study demonstrates the great potential of pseudo-waveforms in improving object-based image analysis. This is especially true since currently the majority of commercial LiDAR data are of discrete return, while waveform data are still not widely available.

© 2014 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). Published by Elsevier B.V. All rights reserved.

1. Introduction and background

Fine-scale land cover mapping is essential for a variety of applications, especially in urbanized areas. Urban resource management, maintenance and planning, and pattern analysis all benefit from an accurate and detailed land cover classification. To achieve this, two emerging remote sensing techniques, high spatial resolution (HSR) multispectral imagery and high density Light Detection and Ranging (LiDAR), have been more and more frequently used to develop fine-scale urban land cover maps (Zhou, 2013). The recent launch of many commercial HSR sensor systems (such as GeoEye-1, Pléiades-2, and WorldView-3) has greatly improved the spatial resolution of remotely sensed imagery, with several 1–4 m multispectral bands and a sub-meter panchromatic band. In consort with the increasing availability of HSR remote sensors, object-based image analysis (OBIA) techniques have rapidly developed for fine-scale land cover mapping in the last decade (Blaschke, 2010; Berger et al., 2013; Zhou, 2013). OBIA performs image classification using image objects or segments rather than pixels as processing units.
Image objects are generated through an image segmentation procedure, with each segment composed of spatially adjacent pixels grouped according to some pre-defined homogeneity criteria (Blaschke, 2010). Many studies have demonstrated that OBIA approaches are superior to the pixel-based image

http://dx.doi.org/10.1016/j.isprsjprs.2014.12.013

⇑ Corresponding author. Tel.: +1 972 883 4134; fax: +1 972 883 6573.
E-mail addresses: yxz102020@utdallas.edu (Y. Zhou), ffqiu@utdallas.edu (F. Qiu).

ISPRS Journal of Photogrammetry and Remote Sensing 101 (2015) 221–232
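The segmentation principle described in the introduction — spatially adjacent pixels grouped under a pre-defined homogeneity criterion — can be sketched with a minimal region-growing example. This is only an illustration of the general idea, not the segmentation algorithm used in the study; the 4-connectivity, the single-band input, and the `max_diff` threshold are assumptions made for brevity.

```python
import numpy as np
from collections import deque

def region_grow(img, max_diff=10.0):
    """Group spatially adjacent pixels whose values stay within `max_diff`
    of their seed pixel -- a minimal homogeneity criterion. Returns an
    integer label image, one label per segment."""
    labels = np.full(img.shape, -1, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(labels < 0)):   # scan pixels in row-major order
        if labels[seed] >= 0:                   # already assigned to a segment
            continue
        queue, seed_val = deque([seed]), img[seed]
        labels[seed] = current
        while queue:                            # breadth-first region growing
            r, c = queue.popleft()
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < img.shape[0] and 0 <= nc < img.shape[1]
                        and labels[nr, nc] < 0
                        and abs(img[nr, nc] - seed_val) <= max_diff):
                    labels[nr, nc] = current
                    queue.append((nr, nc))
        current += 1
    return labels

# A dark homogeneous block next to a bright column -> two segments.
img = np.array([[0.0, 1.0, 50.0],
                [2.0, 1.0, 52.0],
                [3.0, 2.0, 51.0]])
labels = region_grow(img)
```

Production OBIA segmenters add scale, shape, and multi-band criteria on top of this adjacency-plus-homogeneity core, but the grouping logic is the same.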