Wolf Search Algorithm with Ephemeral Memory
Rui Tang, Simon Fong
Department of Computer and
Information Science
University of Macau
Taipa, Macau SAR
ccfong@umac.mo
Xin-She Yang
Mathematics and Scientific Computing
National Physical Laboratory
Teddington, UK
xin-she.yang@npl.co.uk
Suash Deb
Department of Computer Science &
Engineering
C. V. Raman College of Engineering
Bidyanagar, India
suashdeb@gmail.com
Abstract—In computer science, a computational challenge
exists in finding a globally optimized solution from a
tremendously large search space. Heuristic optimization methods
have therefore been created that can search the very large spaces
of candidate solutions. These methods have been extensively
studied in the past, and progressively extended in order to suit a
wide range of optimization problems. Researchers have recently proposed a collection of heuristic optimization methods inspired by the movements of animals and insects (e.g., Firefly, Cuckoos, Bats and Accelerated PSO) that offer efficient computation and easy implementation. This paper proposes a
new bio-inspired heuristic optimization algorithm called the Wolf
Search Algorithm (WSA) that imitates the way wolves search for
food and survive by avoiding their enemies. The contribution of
the paper is twofold: 1. for verifying the efficacy of the WSA the
algorithm is tested quantitatively and compared to other
heuristic algorithms under a range of popular non-convex
functions used as performance test problems for optimization
algorithms; 2. The WSA is investigated with respective to its
memory requirement. Superior results are observed in most tests.
Index Terms—Metaheuristic; Bio-inspired Optimization; Wolf
Search Algorithm
I. INTRODUCTION
An optimization problem generally aims to find x_opt = arg min_{x ∈ S} f(x), where S is the search space and f(x) is a fitness function measuring the goodness of the solution. The global optimum represents the best solution x_opt that is assumed to exist in the problem space. In many real-life applications, the optimization functions may not behave well mathematically, and searching for a globally optimal solution is a well-known challenge.
Fig. 1. Griewank function with its local minima and a global minimum. Left: zoomed out, x=[-600, 600]; Right: zoomed in, x=[-150, 150]
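The standard d-dimensional Griewank function is f(x) = 1 + (1/4000) Σ x_i² − Π cos(x_i / √i), with its global minimum f(0, …, 0) = 0. A minimal Python sketch (the function name and interface here are illustrative, not from the paper):

```python
import math

def griewank(x):
    """Standard Griewank function; global minimum f(0, ..., 0) = 0."""
    quadratic = sum(xi * xi for xi in x) / 4000.0
    oscillation = math.prod(
        math.cos(xi / math.sqrt(i)) for i, xi in enumerate(x, start=1)
    )
    return 1.0 + quadratic - oscillation
```

The quadratic term gives the bowl shape visible when zoomed out, while the cosine product creates the exponentially many local minima that make the function a hard convergence test.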
For example, the Griewank function that is shown in Figure
1, has been commonly used to test the convergence of
optimization algorithms because its number of local minima
grows exponentially as its number of dimensions increases;
while its single global minimum is located at x=0. In such
cases where the global optimum is hard to find, especially
when the data carry high dimensional variables, the
optimization problems can be complex and the problem sizes
may thwart efficient calculation. For instance, in the travelling
salesman problem, the search-space of candidate solutions
grows more than exponentially as the size of the problem
increases, which makes an exhaustive search for the optimal
solution infeasible. A heuristic optimization method is a strategy for searching the solution space for a global optimum in a more or less intelligent way [1]. This is also known as stochastic optimization. Stochastic optimization is grounded in the belief that a stochastic, high-quality approximation of a global optimum obtained at best effort will probably be more valuable than a deterministic, poor-quality local minimum provided by a classical method, or than no solution at all. It incrementally optimizes a problem by attempting to improve the candidate solution with respect to a given measure of quality defined by a fitness function. It first
generates a candidate solution x_candidate and, as long as the stopping criteria are not met, it checks a neighbor against the current solution (SELECT x_neighbor from the neighborhood of x_candidate). The candidate solution is updated with its neighbor if the neighbor is better (IF f(x_neighbor) < f(x_candidate) THEN x_candidate = x_neighbor), such that the global optimum at the end is x_opt = x_candidate
. As such, heuristic
optimization algorithms are often based on local search
methods in which the solution space is not explored
systematically or exhaustively, but rather a particular heuristic
is characterized by the manner in which the exploration
through the solution space is organized. The authors Yang and
Deb have recently invented a collection of bio-inspired
metaheuristic algorithms, including Firefly [2], Cuckoos [3],
Bats [4] and Accelerated PSO [5]. These bio-inspired heuristic
optimization algorithms have search methods both in breadth and in depth that are largely based on the swarm movement patterns of animals and insects found in nature. Their performance in heuristic optimization has proven superior to that of many classical metaheuristic methods [9, 10] (e.g., genetic algorithms, simulated annealing, Tabu search).
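The generic stochastic local-search loop described above can be sketched as follows (a minimal illustration; the function names, the Gaussian neighbor move, and the fixed iteration budget as the stopping criterion are assumptions for this sketch, not part of any particular algorithm in the literature):

```python
import random

def local_search(f, x0, neighbor, max_iters=10_000):
    """Generic stochastic local search: keep a candidate solution and
    replace it whenever a sampled neighbor has better (lower) fitness."""
    x_candidate = x0
    for _ in range(max_iters):               # stopping criterion: budget
        x_neighbor = neighbor(x_candidate)   # SELECT a neighbor
        if f(x_neighbor) < f(x_candidate):   # IF neighbor is better
            x_candidate = x_neighbor         # THEN accept it
    return x_candidate                       # x_opt at the end

def perturb(x):
    """Example neighbor move: a small Gaussian perturbation."""
    return x + random.gauss(0.0, 0.5)
```

Because the loop only ever accepts improvements, this sketch is a pure hill-climber and can stall in a local minimum; the bio-inspired methods above differ precisely in how they organize the exploration to escape such traps.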
978-1-4673-2430-4/12/$31.00 ©2012 IEEE