This article has been accepted for inclusion in a future issue of this journal. Content is final as presented, with the exception of pagination.
IEEE GEOSCIENCE AND REMOTE SENSING LETTERS
Volumetric Directional Pattern for Spatial Feature
Extraction in Hyperspectral Imagery
Almabrok Essa, Student Member, IEEE, Paheding Sidike, and Vijayan Asari, Senior Member, IEEE
Abstract—In this letter, we propose to use an enhanced version of the volumetric directional pattern to efficiently extract rich spatial context information from hyperspectral imagery (HSI). The proposed technique fuses the texture information from three consecutive bands of the input HSI. The extracted local image texture features for each pixel of interest are then fed into an extreme learning machine classifier to assign the object category. Experimental results on three standard hyperspectral data sets demonstrate the effectiveness of the proposed method for HSI classification compared with a set of state-of-the-art spatial feature extraction methods.
Index Terms— Extreme learning machine (ELM), feature
extraction, hyperspectral imagery (HSI), volumetric directional
pattern (VDP).
I. INTRODUCTION
THE objective of hyperspectral imagery (HSI) classification is to assign each pixel in an HSI to the class it belongs to, a task also termed thematic mapping. The key to any classification task is to employ feature extraction techniques that extract pertinent features, i.e., those most capable of preserving object class separability under the varying conditions of the image acquisition process.
Spatial information has been shown to contribute significantly to hyperspectral image classification. Over the last decades, a great number of HSI classification schemes that use spatial features have been proposed [2]–[4]. In [2], spatial structural features were generated for HSI classification using the morphological profile (MP). Owing to its successful performance, improved versions were developed, such as the extended MP (EMP) [3] and the extended multiattribute profile (EMAP) [4]. In [5], HSI features are extracted by effectively utilizing band-subset-averaging-based image fusion and recursive filtering (IFRF), which outperforms EMP-based feature extractors for HSI classification. Chen et al. [6] effectively utilized an edge-computation-based approach, in which the spatial and rotational autocorrelations of local image gradients are obtained by gradient local autocorrelations [7]. Texture information is another useful cue that can aid HSI classification. One of the most successful texture descriptors is the Gabor feature [8], [9].
Manuscript received December 17, 2016; revised March 1, 2017
and April 7, 2017; accepted April 12, 2017. (Corresponding author:
Almabrok Essa.)
A. Essa and V. Asari are with the Department of Electrical and Computer Engineering, University of Dayton, Dayton, OH 45469 USA (e-mail: essaa1@udayton.edu; vasari1@udayton.edu).
P. Sidike is with the Center for Sustainability, Saint Louis University,
St. Louis, MO 63108 USA (e-mail: pahedings@slu.edu).
Color versions of one or more of the figures in this letter are available
online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/LGRS.2017.2695559
In [8], 2-D Gabor features were generated to capture the
spatial information of hyperspectral images in different scales
and orientations. Bau et al. [9] introduced a 3-D Gabor filter model for spectral–spatial information. It generates a set of features that capture specific orientation-, scale-, and wavelength-dependent properties of an HSI region.
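In such Gabor-based schemes, the response at a single scale and orientation comes from convolving a band with a 2-D Gabor kernel (a cosine carrier modulated by a Gaussian envelope). The following minimal sketch illustrates the kernel itself; the kernel size and the `wavelength`, `sigma`, and `gamma` parameters are our own illustrative choices, not the configurations used in [8] or [9]:

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma, gamma=0.5, psi=0.0):
    """Real part of a 2-D Gabor kernel at one scale (wavelength, sigma)
    and one orientation (theta)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate the coordinate frame by theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    # Gaussian envelope (gamma controls the spatial aspect ratio).
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    # Sinusoidal carrier along the rotated x-axis.
    carrier = np.cos(2 * np.pi * xr / wavelength + psi)
    return envelope * carrier
```

A filter bank is then obtained by sweeping `theta` over several orientations and `wavelength`/`sigma` over several scales, and each band is convolved with every kernel.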
One of the best performing texture algorithms based on the concept of a local pattern descriptor is the local binary pattern (LBP) [10], [11]. LBP has been applied to extract spatial texture features for HSI classification [12], where it yields significantly better results than other spatial-feature-based HSI classification techniques. In this method, an LBP code image is generated for each band of the input HSI. To describe the spatial characteristics of a pixel, the LBP histogram for each pixel of interest is computed over its corresponding neighborhood region. However, this method considers neither the texture features from the magnitude component of the local image differences nor the local features from multiple resolutions of the image. Therefore, Sidike et al. [13] introduced a new spatial-feature-based HSI classification framework that computes the completed LBP (CLBP) at multiple scales. In their work, local structural components containing the difference signs (i.e., the original LBP) and the difference magnitudes are combined to obtain rich textural information. Furthermore, multiscale analysis of CLBP was used to further improve the classification accuracy.
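As a reference point for the local pattern descriptors discussed above, the basic LBP step (per-band code image, then a per-pixel neighborhood histogram) can be sketched as follows. This is a plain illustration of the LBP idea; the radius-1 neighborhood, the patch size, and the function names are our own choices, not the exact implementations of [12] or [13]:

```python
import numpy as np

def lbp_code_image(band):
    """8-neighbor, radius-1 LBP code for each interior pixel of one band.
    Each neighbor >= center sets one bit, so codes lie in [0, 255]."""
    h, w = band.shape
    codes = np.zeros((h, w), dtype=np.uint8)
    # Clockwise 8-neighborhood offsets starting at the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            center = band[i, j]
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if band[i + di, j + dj] >= center:
                    code |= 1 << bit
            codes[i, j] = code
    return codes

def lbp_histogram(codes, i, j, radius=2, bins=256):
    """Normalized histogram of LBP codes in a (2*radius+1)^2 patch
    around pixel (i, j); this is the pixel's spatial feature vector."""
    patch = codes[max(i - radius, 0):i + radius + 1,
                  max(j - radius, 0):j + radius + 1]
    hist, _ = np.histogram(patch, bins=bins, range=(0, bins))
    return hist / hist.sum()
```

In the HSI setting, this pair of steps is repeated per band, and the per-pixel histograms are concatenated across bands to form the final feature vector.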
In this letter, we propose a modified version of the volumetric directional pattern (VDP) spatial feature extraction technique to extract texture information from HSI. Unlike the techniques mentioned above, VDP extracts texture features from the directional magnitude components of three consecutive bands in the HSI to describe the spatial characteristics of the pixel of interest in each band. A histogram is then built to represent the target pixel as a 1-D vector using its corresponding texture features. In the pixelwise classification stage, an extreme learning machine (ELM) [14] is employed owing to its computational efficiency and promising classification performance [15], [16]. Experimental results show the promising performance of the modified VDP for the HSI classification task.
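The ELM in [14] trains a single-hidden-layer network by fixing the hidden weights at random and solving the output weights in closed form by a least-squares fit, which is what makes it computationally efficient. The following is a minimal generic sketch of that idea; the hidden-layer size, the tanh activation, and the function names are illustrative assumptions, not the configuration used in our experiments:

```python
import numpy as np

def train_elm(X, y, n_hidden=100, n_classes=None, rng=None):
    """Basic ELM: random hidden layer + closed-form output weights."""
    rng = np.random.default_rng(rng)
    n_classes = n_classes or int(y.max()) + 1
    # Hidden-layer weights and biases are drawn once and never updated.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)          # hidden-layer activations
    T = np.eye(n_classes)[y]        # one-hot class targets
    beta = np.linalg.pinv(H) @ T    # least-squares output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    """Class label = argmax of the linear readout of the hidden layer."""
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
```

In our framework, the rows of `X` would be the per-pixel VDP histogram features, so the only trained quantity is the output weight matrix `beta`.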
The rest of this letter is organized as follows. In Section II, a detailed description of the mathematical formulation of the improved VDP technique is provided. Then the HSI classification framework using the modified VDP and ELM is illustrated in Section III. We describe the data sets used in the experiments and then present the performance evaluation as well as the comparison with state-of-the-art methods in Section IV. Finally, conclusions are drawn in Section V.
1545-598X © 2017 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.