Supervised Learning with Kernel Methods

TAOUALI OKBA, ELAISSI ILYES, GARNA TAREK AND MESSAOUD HASSANI
Unité de Recherche ATSI, Ecole Nationale d'Ingénieurs de Monastir
Rue Ibn El Jazzar, 5019 Monastir, Tunisia; Tel: +(216) 73 500 511, Fax: +(216) 73 500 514
taoualiokba@yahoo.fr, ilyes.elaissi@yahoo.fr, tarek.garna@enim.rnu.tn, hassani.messaoud@enim.rnu.tn

Abstract: This paper proposes a comparative study of three kernel methods for the identification of nonlinear systems modelled in a Reproducing Kernel Hilbert Space (RKHS), where the model output is a linear combination of kernel functions. The coefficients of this combination are the model parameters, whose number equals the number of observations used in the learning phase. These methods are Support Vector Machines (SVM), Regularization Networks (RN) and Kernel Principal Component Analysis (KPCA). The performance of each method, in terms of generalization ability and computing time, is evaluated on numerical simulations.

Key-Words: Identification, RKHS, SLT, SVM, RN, KPCA

1 Introduction
The last few years have seen the birth of a new modelling technique for nonlinear systems, developed in a particular Hilbert space known as a Reproducing Kernel Hilbert Space (RKHS), which uses Statistical Learning Theory (SLT) to provide an RKHS model as a linear combination of the kernels spanning this space [4], [6], [7], [12], [13], [14]. The developed models are an attractive alternative to other modelling techniques based on Volterra series or neural networks. Indeed, the solution of the optimization problem in an RKHS is a global minimum, contrary to that provided by neural networks. The solution is obtained by solving a quadratic optimization problem using learning algorithms such as Support Vector Machines (SVM) [1], Regularization Networks (RN) [2] and Kernel Principal Component Analysis (KPCA) [9]. These algorithms, known as kernel methods, construct RKHS models according to the Structural Risk Minimization (SRM) principle. The number of parameters of these models depends only on the number of observations, and not on the model structure as in conventional modelling approaches.

The paper is organized as follows. Section 2 recalls the definition of an RKHS. Section 3 is devoted to modelling in RKHS. The SVM, RN and KPCA methods are presented in Section 4 and then tested on a benchmark identification problem in Section 5 [11].

2 Reproducing Kernel Hilbert Space
Let $E \subset \mathbb{R}^d$ be an input space and $L^2(E)$ the Hilbert space of square-integrable functions defined on $E$. Let $k : E \times E \to \mathbb{R}$ be a continuous positive definite kernel. It is proved in [15] that there exists a sequence of orthonormal eigenfunctions $(\psi_1, \psi_2, \ldots, \psi_l)$ in $L^2(E)$ (where $l$ can be infinite) and a sequence of corresponding real positive eigenvalues $(\sigma_1, \sigma_2, \ldots, \sigma_l)$ such that the kernel $k$ can be written as:

$$k(x,t) = \sum_{j=1}^{l} \sigma_j \, \psi_j(x) \, \psi_j(t), \qquad x, t \in E \quad (1)$$

Let $F_k \subset L^2(E)$ be the Hilbert space associated with the kernel $k$ and defined by:

$$F_k = \left\{ f \in L^2(E) \ : \ f = \sum_{i=1}^{l} w_i \varphi_i \ \text{and} \ \sum_{j=1}^{l} \frac{w_j^2}{\sigma_j} < +\infty \right\} \quad (2)$$

where $\varphi_i = \sqrt{\sigma_i}\, \psi_i$, $i = 1, \ldots, l$. The scalar product in the space $F_k$ is given by:

$$\langle f, g \rangle_{F_k} = \Big\langle \sum_{i=1}^{l} w_i \varphi_i \, , \, \sum_{j=1}^{l} z_j \varphi_j \Big\rangle_{F_k} = \sum_{i=1}^{l} w_i z_i \quad (3)$$

The kernel $k$ is said to be a reproducing kernel of the Hilbert space $F_k$ if and only if the following conditions are satisfied.
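For concreteness, the following minimal Python sketch illustrates the objects introduced above: it builds the Gram matrix of a kernel on a finite sample, checks its positive semidefiniteness through its eigenvalues (the finite-sample analogue of the eigenvalues $\sigma_j$ in the Mercer expansion (1)), and evaluates an RKHS model of the form $f(x) = \sum_i \alpha_i \, k(x_i, x)$, the linear-in-the-parameters form described in the abstract and introduction. The Gaussian kernel, the sample points and the coefficient values are illustrative assumptions, not choices prescribed by the paper.

```python
import numpy as np

def gaussian_kernel(x, t, sigma=1.0):
    """Gaussian (RBF) kernel k(x, t): a common positive definite choice
    (assumed here for illustration only)."""
    return np.exp(-np.sum((x - t) ** 2) / (2.0 * sigma ** 2))

# Finite sample drawn from the input space E (illustrative values).
X = np.array([[0.0], [0.5], [1.0], [1.5]])
N = len(X)

# Gram matrix K[i, j] = k(x_i, x_j).
K = np.array([[gaussian_kernel(X[i], X[j]) for j in range(N)]
              for i in range(N)])

# A positive definite kernel yields a symmetric Gram matrix whose
# eigenvalues are all >= 0, mirroring the positive eigenvalues sigma_j
# of the Mercer expansion (1).
eigvals = np.linalg.eigvalsh(K)
assert np.all(eigvals >= -1e-12), "kernel is not positive semidefinite"

# RKHS model: f(x) = sum_i alpha_i * k(x_i, x).  The alpha_i are the
# model parameters, one per observation; the values below are arbitrary
# placeholders (SVM, RN or KPCA would estimate them from data, cf.
# Sections 3 and 4).
alpha = np.array([0.8, -0.2, 0.5, 0.1])

def f(x):
    return sum(alpha[i] * gaussian_kernel(X[i], x) for i in range(N))

print("eigenvalues of K:", eigvals)
print("f(0.75) =", f(np.array([0.75])))
```

Note that the model has exactly as many coefficients $\alpha_i$ as there are observations in the sample, as stated in the abstract; the learning algorithms of Section 4 differ only in how these coefficients are estimated.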