Expert Systems With Applications 62 (2016) 317–332
A differential-based harmony search algorithm for the optimization of continuous problems

Hosein Abedinpourshotorban a,b, Shafaatunnur Hasan a,b, Siti Mariyam Shamsuddin a,∗,b, Nur Fatimah As’Sahra b

a UTM Big Data Centre, Ibnu Sina Institute for Scientific and Industrial Research, Universiti Teknologi Malaysia, 81310 Skudai, Johor, Malaysia
b Faculty of Computing, Universiti Teknologi Malaysia, 81310 Skudai, Johor, Malaysia
Article info
Article history:
Received 3 November 2015
Revised 4 February 2016
Accepted 7 May 2016
Available online 10 May 2016
Keywords:
Harmony search algorithm
Continuous optimization
Evolutionary optimization
Differential evolution
Meta-heuristics
Abstract
The performance of the Harmony Search (HS) algorithm is highly dependent on its parameter settings and on the initialization of the Harmony Memory (HM). To address these issues, this paper presents a new variant of the HS algorithm, called the DH/best algorithm, for the global optimization of continuous problems. The proposed DH/best algorithm introduces a new improvisation method that differs from conventional HS in two respects. First, the random initialization of the HM is replaced with a new method that effectively initializes the harmonies and reduces randomness. Second, the conventional pitch adjustment method is replaced by a new one inspired by the Differential Evolution (DE) mutation strategy known as DE/best/1. Two sets of experiments are performed to evaluate the proposed algorithm. In the first experiment, the DH/best algorithm is compared with other HS variants on 12 optimization functions. In the second experiment, the complete CEC2014 problem set is used to compare the performance of the DH/best algorithm with that of six well-known optimization algorithms from different families. The experimental results demonstrate the superiority of the proposed algorithm in terms of convergence, precision, and robustness.
© 2016 Elsevier Ltd. All rights reserved.
1. Introduction
Optimization refers to the process of selecting the best solution from the set of all possible solutions so as to maximize or minimize the cost of the problem (Moh’d Alia & Mandava, 2011). Optimization problems can be categorized as discrete or continuous based on the solution set (Velho, Carvalho, Gomes, & de Figueiredo, 2011). A further categorization is based on properties of the objective function, such as whether it is unimodal or multimodal.
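As a concrete illustration (an example added here, not taken from the paper), the sphere function is a standard unimodal continuous objective with a single global minimum, while the Rastrigin function is multimodal with many local minima:

```python
import math

def sphere(x):
    # Unimodal: a single global minimum at the origin, f(0, ..., 0) = 0.
    return sum(xi ** 2 for xi in x)

def rastrigin(x):
    # Multimodal: many local minima; the global minimum is at the origin,
    # where f(0, ..., 0) = 0.
    return 10 * len(x) + sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)
```

A local search started near a local minimum of the Rastrigin function can stall there, which is exactly the difficulty that motivates the meta-heuristics discussed below.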
Consequently, various optimization algorithms are required to tackle different problems. There are two types of optimization algorithms: exact and approximate (Stützle, 1999). Exact algorithms are guaranteed to find the best solution within a certain period of time (Weise, 2009). However, real-world problems are mostly NP-hard, and solving this type of problem with exact algorithms requires exponential amounts of time (Johnson, 1985; Michael & David, 1979). Thus, approximate algorithms have recently been applied to find near-optimal solutions to NP-hard problems in reasonable amounts of time.

∗ Corresponding author. Tel.: 6075531993; Fax: 6075565044.
E-mail addresses: h_abedinpour@gmail.com (H. Abedinpourshotorban), shafaatunnur@utm.my (S. Hasan), mariyam@utm.my (S.M. Shamsuddin), fatim.assahra@gmail.com (N.F. As’Sahra).
Meta-heuristics are approximate algorithms that are able to find satisfactory solutions to optimization problems in reasonable amounts of time (Blum & Roli, 2003, 2008). Meta-heuristics also address a major drawback of approximate local search algorithms, namely becoming trapped in local minima instead of finding global minima.
Differential Evolution (DE) (Price, Storn, & Lampinen, 2006; Storn & Price, 1995, 1997) emerged in the late 1990s and is one of the most competitive meta-heuristic algorithms. The DE algorithm is somewhat similar to the Genetic Algorithm (GA), but its solutions consist of real values instead of binary values, and it generally converges faster than the GA (Hegerty, Hung, & Kasprak, 2009).
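The DE/best/1 mutation strategy named in the abstract perturbs the current best solution with a scaled difference of two randomly chosen population members, v_i = x_best + F·(x_r1 − x_r2). The following is a minimal NumPy sketch of that standard strategy with hypothetical function and parameter names, not the authors' implementation:

```python
import numpy as np

def de_best_1_mutation(population, best_index, F=0.5, rng=None):
    """Generate one DE/best/1 mutant vector per individual.

    population : (NP, D) array of real-valued candidate solutions
    best_index : index of the current best solution in the population
    F          : scale factor, typically in (0, 1]
    """
    rng = rng or np.random.default_rng()
    NP, _ = population.shape
    mutants = np.empty_like(population)
    for i in range(NP):
        # Pick two distinct random indices, both different from i.
        r1, r2 = rng.choice([j for j in range(NP) if j != i],
                            size=2, replace=False)
        # v_i = x_best + F * (x_r1 - x_r2)
        mutants[i] = population[best_index] + F * (population[r1] - population[r2])
    return mutants
```

Because every mutant is anchored at the best solution, DE/best/1 is greedier than the DE/rand/1 strategy, which is the trade-off the proposed pitch adjustment exploits.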
The performance of DE depends greatly on its parameter settings (Islam, Das, Ghosh, Roy, & Suganthan, 2012). Many variants of DE have been proposed to address different problems, but DE still faces several difficulties in optimizing some types of functions, as has been pointed out in several recent publications (Hansen & Kern, 2004; Ronkkonen, Kukkonen, & Price, 2005). However, due to the optimization power of the DE algorithm, it is commonly applied to the optimization of real-world problems, such as optimizing compressor supply systems (Hancox & Derksen, 2005),
http://dx.doi.org/10.1016/j.eswa.2016.05.013