Contents lists available at ScienceDirect: Advances in Engineering Software
journal homepage: www.elsevier.com/locate/advengsoft

Solution of structural and mathematical optimization problems using a new hybrid swarm intelligence optimization algorithm

Ali Mortazavi a, Vedat Toğan b,*, Mahsa Moloodpoor c
a Civil Engineering Department, Usak University, Usak, Turkey
b Civil Engineering Department, Karadeniz Technical University, 61080 Trabzon, Turkey
c Mechanical Engineering Department, Ege University, Izmir, Turkey

ARTICLE INFO

Keywords: Integrated particle swarm optimization (iPSO); Teaching and learning based optimization (TLBO); Hybrid optimization method

ABSTRACT

In this investigation, a new optimization algorithm named the interactive search algorithm (ISA) is presented. This method is developed by modifying and hybridizing the affirmative features of the recently developed integrated particle swarm optimization (iPSO) algorithm with the pairwise knowledge-sharing mechanism of the teaching and learning based optimization (TLBO) method. The proposed ISA provides two different navigation schemes, Tracking and Interacting. Based on its tendency factor, each agent can pick one of these two schemes for searching the domain. Additionally, ISA utilizes an improved fly-back technique to handle problem constraints. The proposed method is tested on a set of mathematical and structural optimization benchmark problems with discrete and continuous variables. Numerical results indicate that the new algorithm is competitive with other well-established metaheuristic algorithms.

1. Introduction

Optimization techniques have been widely applied in different fields of science and engineering to attain an optimal state for the desired systems. This target is usually met through the maximization or minimization of proper objective function(s) subject to some specific constraints.
In structural optimization problems, the weight of the system is most often designated as the objective function. The main goal is to minimize this objective function such that all constraints stay feasible and no variable bounds are violated. For example, displacement and stress limitations are two important constraints for this class of problems. When solving an optimization problem, choosing an efficient method plays a crucial role in the accuracy and computational time of the solution process.

Generally, optimization methods can be categorized into two main groups: gradient-based and non-gradient-based techniques [1,2]. Gradient-based methods require continuous objective functions and their gradients to compute a proper search direction and/or an appropriate step size. These methods have a rapid convergence rate and low computational cost. However, finding a continuous objective function for several optimization problems can be very difficult or even impossible. Also, due to their sensitivity to the starting point, especially in constrained problems with more complex search boundaries [3], they can get trapped in local minima on more complex search spaces [4]. These shortcomings limit their usage in more complicated optimization problems [5]. In contrast, non-gradient-based techniques examine the search space numerically for the optimal solution via progressive, stochastic evaluation of the search space. Hence, they do not require any gradient of the objective function. Especially along with developments in computer technology, these methods have gained increasing attention among researchers.

Among the non-gradient methods, metaheuristic algorithms provide a mathematical model that is generally inspired by a natural phenomenon such as physical principles or social laws.
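The starting-point sensitivity of gradient-based methods described above can be illustrated with a minimal sketch (not from the paper): plain gradient descent on a multimodal one-dimensional function converges to different minima depending on where it starts. The test function and step size below are arbitrary choices for illustration.

```python
def grad_descent(f_prime, x0, lr=0.01, steps=2000):
    """Plain gradient descent; the result depends strongly on x0."""
    x = x0
    for _ in range(steps):
        x -= lr * f_prime(x)
    return x

# Multimodal test function f(x) = x^4 - 3x^2 + x, which has two minima:
# a global one near x = -1.30 and a local one near x = 1.13.
f = lambda x: x**4 - 3 * x**2 + x
f_prime = lambda x: 4 * x**3 - 6 * x + 1

x_left = grad_descent(f_prime, x0=-2.0)   # settles in the global minimum
x_right = grad_descent(f_prime, x0=2.0)   # trapped in the local minimum
```

Both runs satisfy the first-order condition f'(x) = 0, yet only the first finds the better objective value, which is exactly the shortcoming that motivates stochastic, gradient-free search.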
For example, the genetic algorithm (GA) [6–8], the particle swarm optimizer (PSO) and its variants [9–13], the ant colony optimization (ACO) and its enhanced variants [14–16], the teaching and learning based optimization (TLBO) and its improved versions [17–20], the water wave optimization (WWA) [21], the virus optimization algorithm (VOA) [22], and the bat inspired algorithm (BIA) [23] are metaheuristic methods that have been widely implemented in structural optimization problems [24].

Although the aforementioned methods utilize different strategies to find the optimal solution, global and local search capabilities are commonly two important specifications of a metaheuristic optimizer. Establishing a proper balance between these two search strategies leads to obtaining the optimal solution with lower computational cost and higher accuracy [25,26]. Meeting this aim for metaheuristic algorithms with a higher number of adjustable parameters is more difficult, since

https://doi.org/10.1016/j.advengsoft.2018.11.004
Received 23 March 2018; Received in revised form 25 October 2018; Accepted 6 November 2018
* Corresponding author. E-mail addresses: ali.mortazavi.php@gmail.com (A. Mortazavi), togan@ktu.edu.tr (V. Toğan).
Advances in Engineering Software 127 (2019) 106–123
0965-9978/ © 2018 Elsevier Ltd. All rights reserved.
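Since the paper's ISA hybridizes iPSO with TLBO, the canonical PSO update is useful background for the global/local balance discussed above. The sketch below is the standard PSO formulation, not the paper's iPSO or ISA; the parameter values (inertia weight w, cognitive and social coefficients c1, c2) are common textbook defaults, and a larger w favors exploration while a smaller w favors exploitation.

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Canonical PSO minimizing f over [lo, hi]^dim."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                 # personal best positions
    pbest_f = [f(x) for x in X]
    g = pbest_f.index(min(pbest_f))
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
            fx = f(X[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = X[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = X[i][:], fx
    return gbest, gbest_f

# Usage: minimize the sphere function (optimum 0 at the origin)
best, best_f = pso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5, 5))
```

The cognitive term (c1) drives local refinement around each particle's own best, while the social term (c2) pulls the swarm toward the global best; tuning w, c1, and c2 is precisely the parameter-balancing burden the paragraph above refers to.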