Support vector regression of membership functions and belief functions – Application for pattern recognition

Hicham Laanaya a,b,*, Arnaud Martin b, Driss Aboutajdine a, Ali Khenchaf b

a GSCM-LRIT, Mohammed V-Agdal University, Faculty of Sciences of Rabat, Morocco
b ENSIETA-E3I2-EA3876, 2, rue François Verny, 29806 Brest Cedex 9, France

Article history: Received 25 May 2007; Received in revised form 21 December 2009; Accepted 22 December 2009; Available online 4 January 2010

Keywords: SVR; SVM; Regression; Belief functions; Membership functions

Abstract

Motivated by many applications during the last few years, many models have been proposed to represent imprecise and uncertain data. These models are essentially based on the theory of fuzzy sets, the theory of possibilities and the theory of belief functions. The first two theories are based on membership functions and the last one on belief functions. Hence, it could be interesting to learn these membership and belief functions from data, so that we can, for example, deduce the class for a classification task. Therefore, we propose in this paper a regression approach based on the statistical learning theory of Vapnik. Membership and belief functions share the same properties, which we take as constraints in the resolution of the convex problem underlying support vector regression. The proposed approach is applied in a pattern recognition context to evaluate its efficiency. Hence, the regression of the membership functions and the regression of the belief functions give two kinds of classifiers: a fuzzy SVM and a belief SVM. From the learning data, the membership and belief functions are generated by two classical approaches, the fuzzy and belief k-nearest neighbors, respectively. We therefore compare the proposed approach, in terms of classification results, with these two k-nearest neighbors and with the support vector machines classifier.

© 2009 Elsevier B.V. All rights reserved.

1. Introduction

The study of uncertain environments using more and more complex systems is necessary in many applications. We must then evaluate these systems from uncertain and imprecise data. Several choices are suitable to tackle these imperfections (uncertainty and imprecision): either we try to remove them, which requires an understanding of the physics of the sensors used for data acquisition; or we seek to develop a system robust to these imperfections; or again we try to model them.

A precise model of uncertain and imprecise data can be carried out using the theories of uncertainty such as the fuzzy sets theory [1], the theory of possibilities [2,3] or the theory of belief functions [4,5]. These theories of uncertainty are based either on membership functions or on belief functions in order to represent imprecise and uncertain data. Hence, these functions are used in many applications dealing with uncertain environments such as pattern recognition, clustering, data mining, assessment, tracking, control, etc. (for some reviews of such applications see for example [6,7] for the fuzzy sets theory and [8,9] for the theory of belief functions).

Consequently, it could be interesting for many applications to learn these membership and belief functions from data. Therefore, a regression approach based on the statistical learning theory of Vapnik [10,11] is proposed to learn membership and belief functions. The goal of the paper is not to propose a new fuzzy regression [12–14] or belief regression [15].

Support vector machines (SVM), introduced by Vapnik [10,11], are first a binary classification method. The simplicity of this approach and its capacity for extension to non-linear classification have led to significant development and wide use of SVM, particularly in linear regression [16,17].
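To make concrete the kind of targets such a regression would learn, the following is a minimal sketch (not the authors' formulation) of Keller-style fuzzy k-nearest-neighbor membership generation with crisp training labels. The function name and the toy data are illustrative assumptions; the point is that the produced memberships are non-negative and sum to one, the very properties that would be imposed as constraints on the regressed outputs.

```python
import numpy as np

def fuzzy_knn_memberships(X_train, y_train, X_query, n_classes, k=3, m=2.0):
    """Fuzzy k-NN membership generation (Keller-style sketch).

    For each query point, returns one membership degree per class:
    non-negative and summing to one, matching the constraints a
    membership-function regression must satisfy.
    """
    memberships = np.zeros((len(X_query), n_classes))
    for q, x in enumerate(X_query):
        d = np.linalg.norm(X_train - x, axis=1)
        nn = np.argsort(d)[:k]
        # Inverse-distance weights; the small epsilon avoids division by zero
        # when a query coincides with a training point.
        w = 1.0 / (d[nn] ** (2.0 / (m - 1.0)) + 1e-12)
        for i, wi in zip(nn, w):
            memberships[q, y_train[i]] += wi
        memberships[q] /= memberships[q].sum()
    return memberships

# Toy two-class example (hypothetical data): these memberships would serve
# as per-class regression targets, and the predicted class is the one with
# maximal membership degree.
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
X_query = np.array([[0.05, 0.1], [0.95, 1.0]])

u = fuzzy_knn_memberships(X_train, y_train, X_query, n_classes=2, k=3)
pred = u.argmax(axis=1)  # -> array([0, 1])
```

In the paper's setting, one such membership value per class is regressed by SVR over the feature space, with the non-negativity and unit-sum properties entering the convex optimization as output constraints rather than being restored by post hoc normalization as here.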
This is the reason why we use a support vector regression (SVR) for the regression of the membership functions and the belief functions, whose shared properties are introduced as constraints in the optimization problem. The proposed method differs from classical multiple regression by SVM because of the constraints over the outputs.

The evaluation of the proposed approach is made in a pattern recognition context. For pattern recognition, many methods were developed within the framework of the theories of uncertainty using existing methods such as neural networks, k-nearest neighbors or decision trees, giving new approaches for classification such as fuzzy classifiers [18–20] or belief classifiers [21–25]. Various attempts were proposed to integrate fuzziness in the SVM. Indeed, learning can be carried out using weighted membership functions [26]. In [27–29], a membership function is

* Corresponding author. Address: GSCM-LRIT, Mohammed V-Agdal University, Faculty of Sciences of Rabat, Morocco. Tel.: +33 672305438; fax: +33 344234477. E-mail address: hicham.laanaya@gmail.com (H. Laanaya).

Information Fusion 11 (2010) 338–350. doi:10.1016/j.inffus.2009.12.007