International Journal of Industrial Engineering Computations 13 (2022) 237–254
Contents lists available at GrowingScience
homepage: www.GrowingScience.com/ijiec
doi: 10.5267/j.ijiec.2021.11.001
© 2022 Growing Science Ltd.
* Corresponding author. E-mail: omerryilmaz@gmail.com (Ö. Yılmaz)
A new hybrid algorithm based on MVO and SA for function optimization
Ömer Yılmaz a,*, Adem Alpaslan Altun b and Murat Köklü b
a Department of Information Technologies, Tokat Vocational and Technical Anatolian High School, 60100, Tokat, Turkey
b Department of Computer Engineering, Faculty of Technology, Konya Selcuk University, 42130, Konya, Turkey
ARTICLE HISTORY
Received: May 11, 2021
Received in revised format: June 28, 2021
Accepted: October 22, 2021
Available online: October 27, 2021
ABSTRACT
Hybrid algorithms are widely used today to increase the performance of existing algorithms. In this
paper, a new hybrid algorithm called IMVOSA, based on the multi-verse optimizer (MVO) and
simulated annealing (SA), is proposed. In this model, a new method called black hole selection
(BHS) is introduced, through which both exploration and exploitation can be increased. In the BHS method, the
acceptance probability feature of the SA algorithm is used to increase exploitation by searching
the best regions found by the MVO algorithm. The proposed IMVOSA algorithm has been tested
on 50 benchmark functions, and its performance has been compared with that of recent and
well-known metaheuristic algorithms. The results show that IMVOSA produces highly
successful and competitive results.
© 2022 by the authors; licensee Growing Science, Canada
Keywords:
Simulated annealing
Multi-verse optimizer
Hybrid optimization algorithm
Function optimization
1. Introduction
Optimization is defined as the process of finding the best solution among alternative solutions under the conditions given
for a specific problem. The basic goal of an optimization method is to find the parameter values that yield the best value of the
fitness function (Murty, 2003). Due to the tremendous recent development of information technology, the use of optimization
methods has increased. Many real-world problems can be cast as optimization problems, and many algorithms have been used
to solve them. Metaheuristic algorithms are among the most popular algorithms for solving optimization
problems.
Metaheuristic algorithms aim to examine the search space effectively and efficiently in optimization problems where a
mathematical model cannot be established or where building one is very costly. Although these algorithms cannot always
guarantee the global best solution, their ease of application, their ability to produce fast and
effective solutions to large-scale and complex problems, and the fact that a metaheuristic developed for one problem
can also be applied to other problems make these methods very useful (Kaya & Fığlalı, 2018; Talbi, 2009). The most important
advantage of metaheuristic algorithms can be said to be their ability to reach the global best without getting stuck in
local optima (Laporte et al., 2000). The literature contains various metaheuristic algorithms that have been used
and accepted in many studies. Differential Evolution (DE) (Storn, 1996; Storn & Price, 1997), Ant Colony Optimization
(ACO) (Colorni et al., 1991; Jovanovic & Tuba, 2013), Artificial Bee Colony (ABC) (Karaboga, 2005), Gravitational Search
Algorithm (GSA) (Rashedi et al., 2009), Cat Swarm Optimization (CSO) (Chu et al., 2006), Animal Migration Optimization
(AMO) (Li et al., 2014; Luo et al., 2016), Particle Swarm Optimization (PSO) (Kennedy & Eberhart, 1995), Simulated
Annealing (SA) (Kirkpatrick et al., 1983), Harris Hawks Optimization (HHO) (Heidari et al., 2019), and Multi-verse Optimizer
(MVO) (Mirjalili et al., 2016) can be given as examples of metaheuristic algorithms.
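As background for the SA component used in the hybrid described above, the standard Metropolis acceptance rule of simulated annealing (Kirkpatrick et al., 1983) can be sketched as follows. This is a minimal illustration for a minimization problem, not the authors' IMVOSA implementation; the function and variable names are illustrative only.

```python
import math
import random

def metropolis_accept(current_cost, candidate_cost, temperature):
    """Standard SA acceptance rule for minimization (illustrative sketch).

    Improving moves are always accepted; worsening moves are accepted
    with probability exp(-delta / T), which shrinks as T cools.
    """
    if candidate_cost < current_cost:
        return True  # always accept an improvement
    delta = candidate_cost - current_cost
    # Worse solution: accept with the Metropolis probability
    return random.random() < math.exp(-delta / temperature)

# At high temperature, worse moves are often accepted (exploration);
# at low temperature, they are almost never accepted (exploitation).
```

It is this temperature-controlled acceptance probability that, per the abstract, the BHS method borrows to intensify the search around the best regions located by MVO.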