Abstract—MLP, one of the most commonly used classifiers, is a feed-forward, supervised neural network topology. The back propagation algorithm is used to minimize the error between the network output and the target value. For a given classification task, the MLP structure and the learning parameters used in the back propagation algorithm must be decided so as to increase test accuracy. These variables are commonly chosen at random, so finding the values that give maximum test accuracy is a time-consuming process. In this paper, the learning parameters of the back propagation algorithm and the network structure are optimized to achieve a faster and more efficient weight-update process, using three different heuristic optimization algorithms: ABC, GA and SA. Both of the datasets used contain human activity sensor data. The three algorithms are compared on the two datasets, and detailed test results are given. It is observed that, although SA is the fastest of the chosen algorithms, ABC achieves the highest test accuracy for the MLP classifier.

Keywords—multi-layer perceptron, artificial bee colony, genetic algorithm, simulated annealing algorithm, human activity classification.

I. INTRODUCTION

THE main drawback of neural networks is that their parameters are decided by trial and error, or only approximately. If the learning rate in the back propagation algorithm is chosen too large, learning becomes faster but the risk of oscillation appears; if the learning rate is small, the learning process takes a very long time. The same trade-off is observed for the momentum coefficient. The number of hidden neurons is usually determined entirely at random, and selecting a different activation function for each problem can give a better solution depending on the output characteristics. Because deciding all of these values is time-consuming, the development of an optimization algorithm is necessary.
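As a concrete illustration of the search space described above, the four quantities to be decided (hidden neuron number, learning rate, momentum coefficient and activation function type) can be encoded as one candidate solution that a heuristic search such as ABC, GA or SA scores by test accuracy. The ranges and names below are illustrative assumptions, not the ones used in the paper.

```python
import random

# Illustrative bounds; the paper's actual ranges are not reproduced here.
ACTIVATIONS = ("sigmoid", "tanh")

def random_candidate(rng=random):
    """One point in the four-dimensional parameter space."""
    return {
        "hidden_neurons": rng.randint(2, 50),
        "learning_rate": rng.uniform(0.01, 0.9),
        "momentum": rng.uniform(0.0, 0.9),
        "activation": rng.choice(ACTIVATIONS),
    }

candidate = random_candidate()
# A real fitness function would train the MLP with this candidate
# and return its test accuracy on the activity dataset.
```

Each heuristic then differs only in how it proposes and accepts new candidates around points like this one.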
There are various studies in the literature with Artificial Bee Colony (ABC), Simulated Annealing (SA) and Genetic Algorithm (GA) that optimize one parameter while all the others are held constant [1, 2]. There are a limited number of studies that optimize more than one parameter using a single algorithm [3]. In addition to these studies, the work of Çam, Çimen and Yıldırım optimizes three parameters of the back propagation algorithm: the hidden neuron number, the learning rate and the momentum coefficient [4]. In this study, we add the type of the activation function as a fourth parameter.

Murat Taşkıran, Zehra Gülru Çam, and Nihan Kahraman are with Yıldız Technical University, Department of Electronics and Communication Engineering.

The organization of this paper is as follows: the second section contains a short review of the multi-layer perceptron and back propagation; the third section focuses on the optimization algorithms used in this study; in the fourth section, the datasets are introduced and the optimization process is explained; finally, in the last section, results are given.

II. MULTI-LAYER PERCEPTRON

The Multi-Layer Perceptron (MLP) is an essential tool for solving classification, identification and generalization problems. A basic three-layer MLP topology is given in Fig. 1.

Fig. 1: A basic topology of MLP

Each neuron in the first layer takes the features of a sample, generates a weighted summation and passes this summation to an activation function as its argument. The outputs of the activation functions are the outputs of the neurons. A suitable activation function is chosen according to the problem. The number of input neurons is determined by the size of the input feature vector, and the number of output neurons is determined by the number of classes. However, the number of hidden layers and the number of neurons in each hidden layer are generally chosen by the designer through trial and error.
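The per-neuron computation described above (a weighted summation passed through an activation function) can be sketched as follows; the layer sizes and the sigmoid activation are assumptions chosen for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def layer_forward(x, W, b, act=sigmoid):
    # Each neuron forms a weighted sum of its inputs plus a bias,
    # then passes that sum through the activation function.
    return act(W @ x + b)

rng = np.random.default_rng(0)
x = rng.random(4)                                        # 4 input features (assumed)
h = layer_forward(x, rng.random((3, 4)), np.zeros(3))    # 3 hidden neurons (assumed)
y = layer_forward(h, rng.random((2, 3)), np.zeros(2))    # 2 output classes (assumed)
```

Swapping `act` per layer is all that is needed to try the different activation functions considered as the fourth optimized parameter.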
This artificial neural network model uses the back propagation algorithm to minimize the squared error between the network outputs and the target values. The learning process updates the weights at each iteration according to equation (1).

An Efficient Method to Optimize Multi-Layer Perceptron for Classification of Human Activities. Murat Taşkıran, Zehra Gülru Çam and Nihan Kahraman. Int'l Journal of Computing, Communications & Instrumentation Engg. (IJCCIE) Vol. 2, Issue 2 (2015), ISSN 2349-1469, EISSN 2349-1477. http://dx.doi.org/10.15242/IJCCIE.ER1215104, p. 191
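The weight-update step referenced above can be sketched in a common gradient-descent-with-momentum form; this is an assumed standard form involving the learning rate and momentum coefficient optimized in this study, not necessarily the exact rule of equation (1).

```python
import numpy as np

def bp_weight_update(w, grad, prev_delta, eta=0.1, alpha=0.9):
    # Assumed standard momentum update: a large eta learns faster but
    # can oscillate; alpha reuses part of the previous step to damp
    # that oscillation.
    delta = -eta * grad + alpha * prev_delta
    return w + delta, delta

w = np.array([0.5, -0.3])
grad = np.array([0.2, -0.1])     # dE/dw of the squared error (example values)
w, delta = bp_weight_update(w, grad, np.zeros(2))
```

Tuning `eta` and `alpha` by hand is exactly the trial-and-error process that the heuristic algorithms in this paper are meant to replace.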