A Multiobjective Analysis of Adaptive Clustering Algorithms for the Definition of RBF Neural Network Centers in Regression Problems

Rosana Veroneze, André R. Gonçalves, and Fernando J. Von Zuben

School of Electrical and Computer Engineering
University of Campinas
Campinas, São Paulo, Brazil
{veroneze,andreric,vonzuben}@dca.fee.unicamp.br

Abstract. A variety of clustering algorithms have been applied to determine the internal structure of Radial Basis Function Neural Networks (RBFNNs). The k-means algorithm is one of the most common choices for this task, although, like many other clustering algorithms, it must receive the number of prototypes a priori. Setting this number is a nontrivial procedure, mainly in real-world applications. An alternative is to use algorithms that automatically determine the number of prototypes. In this paper, we performed a multiobjective analysis involving three such algorithms: Adaptive Radius Immune Algorithm (ARIA), Affinity Propagation (AP), and Growing Neural Gas (GNG). For each one, the parameters that most influence the resulting number of prototypes composed the decision space, while the RBFNN RMSE and the number of prototypes formed the objective space. The experiments showed that the ARIA solutions achieved the best results for the multiobjective metrics adopted in this paper.

Keywords: Radial Basis Function Neural Network, Adaptive Clustering Algorithms, Regression Problems.

1 Introduction

Radial Basis Function Neural Networks (RBFNNs) are universal approximators and have been successfully applied to a wide range of problems. The main concept in this approach is to represent the function to be approximated by a linear combination of radial basis functions (RBFs). RBFNNs can be trained by either a full or a quick learning scheme. In the former, nonlinear optimization algorithms (e.g.
gradient-descent-based) are used to determine the whole set of parameters of an RBFNN: the center and dispersion of each RBF, and the weights of the output layer. In this case, the number of RBFs is either defined a priori or estimated by a trial-and-error procedure. As for the quick learning scheme, the internal structure of an RBFNN (the number of RBFs, and their centers and dispersions) is given a priori, and the weights of the output layer can be determined by the least squares method.

H. Yin et al. (Eds.): IDEAL 2012, LNCS 7435, pp. 127–134, 2012.
© Springer-Verlag Berlin Heidelberg 2012
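The quick learning scheme described above can be sketched in a few lines: with the centers and dispersions fixed, the output weights follow from an ordinary least squares fit. This is a minimal illustration, not the paper's implementation; it assumes Gaussian RBFs and uses a hypothetical fixed grid of centers where the paper would obtain them from a clustering algorithm (ARIA, AP, or GNG).

```python
import numpy as np

def rbf_design_matrix(X, centers, sigma):
    """Gaussian RBF activations: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 sigma_j^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_output_weights(X, y, centers, sigma):
    """With centers and dispersions fixed a priori, solve the output-layer
    weights (plus a bias) by least squares -- the 'quick' learning scheme."""
    Phi = rbf_design_matrix(X, centers, sigma)
    Phi = np.hstack([Phi, np.ones((X.shape[0], 1))])  # append bias column
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict(X, centers, sigma, w):
    Phi = rbf_design_matrix(X, centers, sigma)
    Phi = np.hstack([Phi, np.ones((X.shape[0], 1))])
    return Phi @ w

# Toy 1-D regression: approximate sin(x) with 10 Gaussian RBFs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])
centers = np.linspace(-3, 3, 10)[:, None]  # hypothetical fixed centers
sigma = np.full(10, 0.8)                   # hypothetical fixed dispersions
w = fit_output_weights(X, y, centers, sigma)
rmse = np.sqrt(np.mean((predict(X, centers, sigma, w) - y) ** 2))
```

Because the nonlinear parameters (centers, dispersions) are frozen, the remaining problem is linear in the weights, which is why a single least squares solve suffices; the quality of the fit then hinges on how well the clustering step placed the centers, which is exactly the trade-off the paper studies.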