Enhancing Evolutionary Algorithms by Efficient
Population Initialization for Constrained Problems
Saber Elsayed*, Ruhul Sarker*, Noha Hamza*, Carlos A. Coello Coello† and Efrén Mezura-Montes‡
*School of Engineering and Information Technology
University of New South Wales Canberra, Australia
Email: {s.elsayed, r.sarker, n.hamza}@unsw.edu.au
†Depto. de Computación, CINVESTAV-IPN, Mexico
Email: ccoello@cs.cinvestav.mx
‡Artificial Intelligence Research Center
University of Veracruz, Mexico
Email: emezura@uv.mx
Abstract—One of the challenges that appear in solving
constrained optimization problems is to quickly locate the
search areas of interest. Although the initial solutions of
any optimization algorithm have a significant effect on its
performance, none of the existing initialization methods can
provide direct information about the objective function and
constraints of the problem to be solved. In this paper, a technique
for generating initial solutions is proposed, which provides
useful information about the behavior of both the objective
function and the constraints. Based on such information, an
automatic mechanism for selecting individuals, from the search
areas of interest, is introduced. The proposed method is
adopted with different evolutionary algorithms and tested on the
CEC2006 and the CEC2010 test problems. The results obtained
show the benefits of the proposed method in enhancing the
performance, and reducing the average computational time, of
several algorithms with respect to their versions adopting other
initialization techniques.
Index Terms—constrained optimization problems, population
initialization, evolutionary algorithms
I. INTRODUCTION
Many engineering, business, computer science and defense
decision processes require solving optimization problems in
the presence of constraints. Such problems are known as
constrained optimization problems (COPs). A COP may
contain different types of variables and constraints. These
problems become more challenging if they possess difficult
characteristics, such as multi-modality, high dimensionality,
and small feasible regions [1]. Formally, a COP can be
expressed as:
\begin{align}
\text{minimize} \quad & f(\vec{x}) \nonumber \\
\text{subject to:} \quad & c_k(\vec{x}) \le 0, \quad k = 1, 2, \ldots, K \nonumber \\
& h_e(\vec{x}) = 0, \quad e = 1, 2, \ldots, E \nonumber \\
& L_j \le x_j \le U_j, \quad j = 1, 2, \ldots, D \tag{1}
\end{align}
where $\vec{x} = [x_1, x_2, \ldots, x_D]$ is a vector of $D$ decision
variables, $f(\vec{x})$ is the objective function, $c_k(\vec{x})$ is the
$k$-th inequality constraint, $h_e(\vec{x})$ is the $e$-th equality
constraint, and each $x_j$ has a lower limit ($L_j$) and an upper
limit ($U_j$).
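Feasibility under formulation (1) is typically quantified by summing the violations of the inequality constraints and of the equality constraints, with the latter relaxed by a small tolerance. The helper below is an illustrative sketch, not part of the paper: the function name, the tolerance eps = 1e-4, and the example constraints are all assumptions.

```python
def constraint_violation(x, ineq_cons, eq_cons, eps=1e-4):
    """Total constraint violation of a candidate solution x.

    ineq_cons: functions c_k with the requirement c_k(x) <= 0.
    eq_cons:   functions h_e with the requirement h_e(x) = 0,
               relaxed to |h_e(x)| <= eps (a common practice).
    Returns 0.0 for a feasible x, a positive value otherwise.
    """
    v = sum(max(0.0, c(x)) for c in ineq_cons)
    v += sum(max(0.0, abs(h(x)) - eps) for h in eq_cons)
    return v

# Illustrative 2-dimensional COP: x1 + x2 - 1 <= 0 and x1 - x2 = 0
ineq = [lambda x: x[0] + x[1] - 1.0]
eq = [lambda x: x[0] - x[1]]
print(constraint_violation((0.2, 0.2), ineq, eq))  # 0.0 (feasible)
print(constraint_violation((0.9, 0.4), ineq, eq))  # > 0 (infeasible)
```

A zero violation marks a feasible point; comparing violation sums is a standard way to rank infeasible candidates during the search.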
Over the years, the solution of COPs has attracted
a considerable amount of research. Among the currently
available approaches to deal with COPs, evolutionary
algorithms (EAs), such as genetic algorithms (GAs) [2] and
differential evolution (DE) [3], have become very popular.
Normally, the first step in such algorithms is to generate an
initial set of solutions to evolve. Because such solutions influence
the performance of an EA, a considerable number of new initialization
methods have been developed, with the main aim of uniformly covering
the search space.
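To situate initialization within such an algorithm, the minimal differential evolution loop below shows where the initial population enters the search; the DE/rand/1/bin scheme, the parameter settings (F = 0.5, CR = 0.9), and the sphere objective are illustrative assumptions, not the paper's method.

```python
import random

def de_minimize(f, lower, upper, pop_size=20, gens=100, F=0.5, CR=0.9, seed=1):
    """Minimal DE/rand/1/bin loop. Step 1 (initialization) is the stage
    an informed initialization method would replace."""
    rng = random.Random(seed)
    dim = len(lower)
    # Step 1: initialization (plain uniform random here)
    pop = [[rng.uniform(lower[j], upper[j]) for j in range(dim)]
           for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: three distinct partners, none equal to i
            a, b, c = rng.sample([k for k in range(pop_size) if k != i], 3)
            jrand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = [pop[a][j] + F * (pop[b][j] - pop[c][j])
                     if (rng.random() < CR or j == jrand) else pop[i][j]
                     for j in range(dim)]
            # Clamp the trial vector back into the box constraints
            trial = [min(max(trial[j], lower[j]), upper[j]) for j in range(dim)]
            ft = f(trial)
            if ft <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, ft
    return min(zip(fit, pop))

best_f, best_x = de_minimize(lambda x: sum(v * v for v in x),
                             [-5.0] * 3, [5.0] * 3)
print(best_f)  # close to 0 on the sphere function
```

Swapping step 1 for a more informed sample is exactly the kind of change whose effect the paper studies.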
The most popular technique for generating a population of
individuals is the pseudo-random number generator (PRNG) [4], which
produces a sequence of random numbers in which the solutions are
scattered according to a uniform distribution, or to any other
statistical distribution. This initialization method is simple, but it
has difficulties as the dimensionality increases [5], because it tends
to fail to generate points that are well distributed over the search
space [4], [6].
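PRNG-based initialization amounts to sampling each coordinate uniformly and independently within its bounds. The sketch below uses Python's standard generator; the bounds, population size, and seed are illustrative.

```python
import random

def init_population(pop_size, lower, upper, seed=None):
    """Generate pop_size individuals uniformly at random within
    [lower[j], upper[j]] in each dimension j (PRNG-based initialization)."""
    rng = random.Random(seed)
    dim = len(lower)
    return [[rng.uniform(lower[j], upper[j]) for j in range(dim)]
            for _ in range(pop_size)]

# Example: 5 individuals in the 3-dimensional box [-5, 5]^3
pop = init_population(5, [-5.0] * 3, [5.0] * 3, seed=42)
print(len(pop), len(pop[0]))  # 5 3
```

Because each coordinate is drawn independently, nothing prevents points from clustering, which is the coverage weakness the space-filling methods below try to address.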
Based on chaos theory [7], the chaotic number generator (CNG) has
been proposed for use with EAs [8]. Among seven chaotic maps tested
with DE, the variant based on the sinus map outperformed all the
other variants [8]. As a type of
space-filling method, uniform experimental design (UED) [9] searches
for a set of points that are uniformly distributed over a given range.
However, evaluating the large population it requires is expensive
for both small- and large-scale problems. This shortcoming
was the motivation for introducing orthogonal design. An
orthogonal array aims to specify a set of combinations spread
uniformly over the space of all possible combinations. In
the literature, such an initialization method enhanced the
performance of several optimization approaches, such as
DE [10]. Latin hypercube sampling (LHS) [11] divides the range of
each variable into a fixed number of intervals (creating grids)
978-1-7281-6929-3/20/$31.00 ©2020 IEEE