Differential Evolution Enhanced by Neighborhood Search
Hui Wang, Zhijian Wu and Shahryar Rahnamayan
Abstract— This paper presents a novel Differential Evolution
(DE) algorithm, called DE enhanced by neighborhood search
(DENS), which differs from previous work utilizing neighborhood
search in DE, such as DE with neighborhood search (NSDE) and
self-adaptive DE with neighborhood search (SaNSDE). DENS
focuses on searching the neighbors of individuals, whereas the
latter two algorithms (NSDE and SaNSDE) work on the adaptation
of the control parameters F and CR. The proposed algorithm
consists of the following two main steps. First, for each individual,
two trial individuals are created by local and global neighborhood
search strategies. Second, the fittest among the current individual
and the two trial individuals is selected as the new current
individual. Experimental studies on a comprehensive set of
benchmark functions show that DENS achieves better results on a
majority of test cases when compared with several similar
evolutionary algorithms.
Index Terms— Differential evolution, neighborhood search,
local search, global optimization.
I. INTRODUCTION
Differential Evolution (DE), proposed by Price and Storn
[1], is an effective, robust, and simple global optimization
algorithm. According to frequently reported experimental
studies, DE has shown better performance than many other
evolutionary algorithms (EAs) in terms of convergence speed
and robustness over several benchmark functions and real-
world problems [2].
Since the development of DE, many improved versions
have been proposed. Based on the improved mechanisms,
we can divide them into three categories as follows.
1) Adaptive Parameter Control: The classical DE algorithm
has only three control parameters, Np (population
size), CR, and F, which greatly affect the performance
of DE. The values of these parameters largely determine
the quality of the obtained solution and the efficiency
of the search [3]. Choosing appropriate parameter values
is a problem-dependent task and requires prior
experience and knowledge from the user. To tackle this
problem, several adaptive parameter control strategies
have been proposed, such as fuzzy adaptive DE (FADE) [4],
self-adaptive DE (SaDE) [5], [6], self-adapting control
parameters in DE (jDE) [3], DE with neighborhood
search (NSDE) [7], and self-adaptive DE with neigh-
borhood search (SaNSDE) [8].
Hui Wang and Zhijian Wu are with the State Key Laboratory of
Software Engineering, Wuhan University, Wuhan, 430072 China (e-mail:
wanghui cug@yahoo.com.cn; zjwu9551@sina.com).
Shahryar Rahnamayan is with Faculty of Engineering and Ap-
plied Science, University of Ontario Institute of Technology (UOIT),
2000 Simcoe Street North, Oshawa, ON L1H 7K4, Canada (e-mail:
shahryar.rahnamayan@uoit.ca).
2) Modified Mutation Strategies: Besides selection, the DE
algorithm has two important operators, mutation and
crossover. The former is governed by the mutation
strategy, and the latter by the crossover probability CR
and the crossover strategy. In addition to adapting the
control parameters, modifications of the mutation
strategy can also improve the performance of DE;
DE/current-to-pbest [9] and DE/target-to-best/1 [10] are
two examples among others.
3) Hybrid Strategies: Recently, new variants have been
introduced that combine the classical DE with concepts
from machine learning and other successful search
techniques. These improved DE variants are called
hybrid DE, such as opposition-based DE (ODE) [11],
[12], [13], [14], DE with adaptive local search
(DEahcSPX) [15], and DE based on generalized
opposition-based learning (GODE) [16].
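To make the second category concrete, the following sketch (ours, not from the paper) contrasts the classical DE/rand/1 mutation with the DE/target-to-best/1 variant cited above; the helper names and the list-of-lists population representation are our own illustrative choices.

```python
import random

def distinct_indices(n, exclude, k):
    """Pick k mutually distinct indices from 0..n-1, none equal to exclude."""
    return random.sample([j for j in range(n) if j != exclude], k)

def mutate_rand_1(pop, i, F):
    """Classical DE/rand/1: v = x_r1 + F * (x_r2 - x_r3)."""
    r1, r2, r3 = distinct_indices(len(pop), i, 3)
    return [pop[r1][d] + F * (pop[r2][d] - pop[r3][d])
            for d in range(len(pop[i]))]

def mutate_target_to_best_1(pop, i, best, F):
    """DE/target-to-best/1: v = x_i + F*(x_best - x_i) + F*(x_r1 - x_r2)."""
    r1, r2 = distinct_indices(len(pop), i, 2)
    return [pop[i][d] + F * (pop[best][d] - pop[i][d])
            + F * (pop[r1][d] - pop[r2][d])
            for d in range(len(pop[i]))]
```

Note that DE/target-to-best/1 pulls the mutant toward the best individual found so far, trading exploration for faster convergence relative to DE/rand/1.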
In this paper, we present a novel DE algorithm, called DE
enhanced by neighborhood search (DENS), to improve the
performance of the standard DE. To verify the performance
of DENS, the current work provides a comparative study
of DENS and other similar DE variants on a comprehensive
set of benchmark functions.
The rest of the paper is organized as follows. In Section II,
the classical DE algorithm is briefly reviewed. The proposed
approach, DENS, is presented in Section III. In Section IV,
the test functions, parameter settings and the comparison of
DENS with other similar algorithms are provided. Finally,
the work is summarized and concluded in Section V.
II. A BRIEF REVIEW OF DIFFERENTIAL EVOLUTION
DE is a population-based stochastic search algorithm, and
has been successfully applied to solve complex problems
including linear and nonlinear, unimodal and multimodal
functions. It has been investigated that DE is faster and more
robust on majority of functions than many other evolutionary
algorithms [2].
There are several variants of DE [1]; the most
popular, denoted "DE/rand/1/bin", is called the classical
version. The proposed algorithm is also based on this DE
scheme. Let us assume that Xi,G (i = 1, 2, ..., Np) is the
ith individual in population P(G), where Np is the
population size, G is the generation index, and P(G) is
the population in the Gth generation. The main idea of
DE is to generate trial vectors. Mutation and crossover
are used to produce new trial vectors, and selection
determines which of the vectors will survive into the
next generation.
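One generation of the classical DE/rand/1/bin scheme described above can be sketched as follows; this is a minimal illustration for a box-constrained minimization problem, and the sphere test function and parameter values in the usage line are our assumptions, not taken from the paper.

```python
import random

def de_rand_1_bin(f, bounds, Np=20, F=0.5, CR=0.9, generations=100):
    """Minimize f over a box using the classical DE/rand/1/bin scheme.

    bounds is a list of (low, high) pairs, one per dimension.
    """
    D = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(Np)]
    fitness = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(Np):
            # Mutation (DE/rand/1) with three distinct random individuals.
            r1, r2, r3 = random.sample([j for j in range(Np) if j != i], 3)
            v = [pop[r1][d] + F * (pop[r2][d] - pop[r3][d]) for d in range(D)]
            # Binomial crossover: each dimension comes from v with prob. CR;
            # j_rand guarantees at least one component from the mutant.
            j_rand = random.randrange(D)
            u = [v[d] if (random.random() < CR or d == j_rand) else pop[i][d]
                 for d in range(D)]
            # Greedy selection: the trial replaces the target if not worse.
            fu = f(u)
            if fu <= fitness[i]:
                pop[i], fitness[i] = u, fu
    best = min(range(Np), key=lambda i: fitness[i])
    return pop[best], fitness[best]

# Usage on the sphere function (an assumed test problem):
sphere = lambda x: sum(xi * xi for xi in x)
x_best, f_best = de_rand_1_bin(sphere, [(-5.0, 5.0)] * 5)
```

The greedy one-to-one selection is what distinguishes DE from many other EAs: a trial vector competes only against its own target vector, never against the rest of the population.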
978-1-4244-8126-2/10/$26.00 ©2010 IEEE