A sensitivity analysis indicator to adapt the shift
length in a metaheuristic
Peio Loubière
CY Cergy Paris University
Pau, France
plo@eisti.eu
Astrid Jourdan
CY Cergy Paris University
Pau, France
aj@eisti.eu
Patrick Siarry
LISSI
UPEC
Vitry-sur-Seine, France
siarry@u-pec.fr
Rachid Chelouah
CY Cergy Paris University
Cergy, France
rc@eisti.eu
Abstract— Population-based metaheuristics (e.g. genetic algorithms, particle swarm optimization) deal with a dichotomy between exploration (discovering unexplored areas) and exploitation (digging around a good solution). The consequence is a wide exploration of the search space. A large amount of information about the link between the objective function and the input variables is collected during the run of the algorithm. Sensitivity analysis methods make it possible to transform this information in order to characterize the effect of an input variable on the objective function: linear impact, nonlinear impact, or negligible impact. We propose to integrate a sensitivity analysis method into the optimization process in order to increase or decrease the shift length when offsetting a variable, according to its behavior. The offset of a variable with a nonlinear impact has to be small in order to catch possible local optima of the objective function. On the contrary, the offset of a variable with a linear impact has to be large in order to move the variable faster toward its best position. A toy example is used to illustrate the interest of the method.
Keywords—optimization, metaheuristics, sensitivity analysis, convergence speed
I. INTRODUCTION
Metaheuristics are strategies that guide a global optimization process for nonlinear objective functions. The goal is to efficiently explore the search space in order to find near-optimal solutions. Metaheuristics deal with a dichotomy between exploration, to discover unexplored areas, and exploitation, to dig around a good solution. Population-based metaheuristics (e.g. genetic algorithms, particle swarm optimization) ensure a wide exploration of the search space.
Starting from a set of initial points, the metaheuristic iterations randomly explore the neighborhood of each point. A neighbor is generated by offsetting a set of variables of the current point; the variables and the offsets are randomly chosen. We propose to use the information gathered during the iterations to guide the algorithm toward smarter choices. The idea is to use the point evaluations to characterize the variables' behavior (relevance and shape) thanks to a sensitivity analysis (SA) method [2][13].
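As a minimal sketch of this neighborhood step (the function and parameter names are ours, not from the paper), a neighbor can be generated by shifting one uniformly chosen variable of the current point:

```python
import random

def generate_neighbor(point, lower, upper, shift=0.1):
    """Offset one uniformly chosen variable of the current point,
    clipping the result to the search-space bounds."""
    neighbor = list(point)
    j = random.randrange(len(point))                     # variable picked at random
    offset = random.uniform(-shift, shift) * (upper[j] - lower[j])
    neighbor[j] = min(max(neighbor[j] + offset, lower[j]), upper[j])
    return neighbor
```

The `shift` parameter bounds the offset relative to the variable's range; adapting it per variable is precisely what the paper proposes.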
Sensitivity analysis is the study of how the input variables affect an output variable. In optimization, SA methods are often used to eliminate non-influential variables before the process, thereby reducing the dimension. In this paper, we present another way to use the information given by an SA method to improve the convergence of the metaheuristic. We assume that proceeding in two steps may not be suitable: removing variables definitively can be damaging to the optimization process. Variables that are irrelevant at the beginning of the algorithm may become relevant later in the search and could discriminate the points (during the exploitation process).
Moreover, sensitivity analysis requires a lot of evaluations of the objective function, as does the metaheuristic. Directly integrating a sensitivity analysis method into a metaheuristic saves evaluations and makes it possible to focus on the relevant variables. The goal is not to compute accurate sensitivity indices, but to obtain enough information about the variables' behavior to guide the optimization process.
According to their search process, metaheuristics can be split into three families: those which search along one direction (variable), those which select a subset of variables, and those which search along all directions. For the first two families, SA helps to focus on the most influential variables. In all cases, the information about variable behavior (monotonicity, nonlinearity, ...) helps to adapt the offset when generating a new neighbor.
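The adaptation rule described above can be sketched as follows (the scaling factors are illustrative assumptions, not values from the paper): small steps for nonlinear variables, large steps for linear ones.

```python
def adapted_shift(base_shift, behavior):
    """Scale the shift length according to the variable's behavior:
    larger steps for a linear impact (move fast toward the best position),
    smaller steps for a nonlinear one (catch possible local optima)."""
    factors = {"linear": 2.0, "nonlinear": 0.5, "negligible": 1.0}
    return base_shift * factors[behavior]
```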
In a previous work [9], Morris' sensitivity method [4] was integrated into the Artificial Bee Colony (ABC) algorithm [4][5]. The ABC algorithm integrates Morris' method fairly well because of its one-direction neighborhood search process: both offset a point along a single variable at a time and analyze the impact on the objective function output.
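For context, here is a simplified sketch of Morris-style elementary effects (random base points instead of the full trajectory design of [4]; all names are our own). A large mean absolute effect flags an influential variable; a large spread of the effects flags non-linearity or interactions.

```python
import random
import statistics

def elementary_effects(f, lower, upper, n_traj=10, delta=0.1):
    """One-at-a-time elementary effects in the spirit of Morris' method.
    Base points are drawn so that the shifted point stays in bounds."""
    d = len(lower)
    effects = [[] for _ in range(d)]
    for _ in range(n_traj):
        x = [random.uniform(lo, hi - delta * (hi - lo))
             for lo, hi in zip(lower, upper)]
        fx = f(x)
        for j in range(d):
            y = list(x)
            y[j] += delta * (upper[j] - lower[j])   # shift one variable only
            effects[j].append((f(y) - fx) / delta)
    # (mean |EE|, stdev of EE): influence vs. non-linearity/interactions
    return [(statistics.mean(abs(e) for e in es), statistics.pstdev(es))
            for es in effects]
```

On a purely linear function the effects are constant (zero spread), which is exactly the signature the proposed method exploits to enlarge the shift.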
Among all the search processes implemented in metaheuristics, the ABC neighborhood search is a particular case. Many metaheuristic algorithms search for a neighbor in a hypersphere, offsetting several variables at a time (tabu search [1], differential evolution (DE) [11], swarm-intelligence-based metaheuristics such as particle swarm optimization (PSO) [6][12]).
In a second work [10], we generalized this approach. We defined a new sensitivity analysis method adapted to a multidimensional neighborhood context. This method was successfully integrated into a second-family algorithm (DE). The SA performed during the algorithm makes it possible to compute, for each variable, a weight proportional to the impact of this variable on the objective function. The uniform random selection of the variables to offset is then replaced by a random selection based on these weights: the most influential variables have a higher chance of being selected.
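A minimal sketch of such a weighted selection (the function name, and the use of `random.choices`, which samples with replacement, are our assumptions):

```python
import random

def select_variables(weights, k):
    """Pick k variable indices with probability proportional to their
    sensitivity weights, instead of uniformly at random."""
    return random.choices(range(len(weights)), weights=weights, k=k)
```

A variable with weight zero is never offset, while variables with large weights dominate the draws, which is the intended bias toward influential directions.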
978-1-7281-6929-3/20/$31.00 ©2020 IEEE