Neurocomputing 70 (2007) 801–808

Bayes classification based on minimum bounding spheres

Jigang Wang*, Predrag Neskovic, Leon N. Cooper
Department of Physics, Institute for Brain and Neural Systems, Brown University, Providence, RI 02912, USA

Available online 15 October 2006

Abstract

The minimum bounding sphere of a set of data, defined as the smallest sphere enclosing the data, was first used by Schölkopf et al. to estimate the VC-dimension of support vector classifiers and later applied by Tax and Duin to data domain description. Given a set of data, the minimum bounding sphere of each class can be computed by solving a quadratic programming problem. Since the spheres are constructed for each class separately, they can easily handle the multi-class classification problem, as proposed by Zhu et al. In this paper, we show that the decision rule proposed by Zhu et al. is generally insufficient to achieve state-of-the-art classification performance. We therefore propose a new decision rule based on Bayes decision theory. This new decision rule significantly improves the performance of the resulting sphere-based classifier. In addition to its low computational complexity and easy extension to multi-class problems, the new classifier achieves performance comparable to that of standard support vector machines on most of the real-world data sets tested.

© 2006 Elsevier B.V. All rights reserved.

Keywords: Pattern classification; Bayes decision rule; Minimum bounding spheres; Support vector machines; Kernel methods

1. Introduction

Given a set of data in some metric space, the minimum bounding sphere is defined to be the smallest sphere that encloses all the data. Similarly, in pattern classification, the minimum bounding sphere for each class is the smallest sphere enclosing all the training data from the corresponding class. The minimum bounding sphere can be computed by solving a quadratic programming (QP) problem.
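As a rough illustration of how such a sphere can be computed, the sketch below uses the Bădoiu–Clarkson core-set iteration, which performs a Frank–Wolfe-style step on the dual of the minimum-enclosing-sphere QP. This is a simplified approximate stand-in, not the exact QP formulation used in the paper, and the function name is our own.

```python
import numpy as np

def min_enclosing_sphere(X, n_iter=1000):
    """Approximate the minimum bounding sphere of the rows of X
    via the Badoiu-Clarkson iteration (an approximation to the
    exact QP solve; illustrative helper, not from the paper)."""
    c = X.mean(axis=0)                    # initial center guess
    for k in range(1, n_iter + 1):
        d = np.linalg.norm(X - c, axis=1)
        j = np.argmax(d)                  # farthest point from current center
        c = c + (X[j] - c) / (k + 1)      # step toward it with size 1/(k+1)
    r = np.linalg.norm(X - c, axis=1).max()
    return c, r
```

After O(1/epsilon) iterations the returned radius is within a (1 + epsilon) factor of the optimum; an exact solution would instead solve the dual QP with an off-the-shelf solver.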
In [12], Schölkopf et al. computed the radius of the minimum bounding sphere of all training data to estimate the VC-dimension of a support vector classifier. In [14], Tax and Duin applied the minimum bounding sphere to data domain description (also called one-class classification).

Given that the minimum bounding sphere for each class is constructed without considering the distribution of training examples of other classes, it is not immediately clear whether an effective classifier can be built from these class-specific minimum bounding spheres. From a computational point of view, however, there is a great advantage to using minimum bounding spheres for pattern classification. Most notably, a classifier based on class-specific minimum bounding spheres can deal with multi-class problems easily, because the minimum bounding sphere of each class can be computed separately and only once. This is in direct contrast to support vector machines (SVMs) with the one-against-one or one-against-rest scheme [3,8], in which the optimal separating hyperplanes have to be computed many times. In addition, the size of the resulting QP problem is smaller than in the standard SVM approach [2,4]. To explore this possibility, Zhu et al. proposed a multi-class classification algorithm that uses the minimum bounding spheres to classify a new example, and showed that the resulting classifier performs comparably to standard SVMs [17].

In this paper we conduct a comparative study using both artificial and real-world data sets and show that the decision rule proposed by Zhu et al. is generally insufficient for achieving state-of-the-art classification performance. Motivated by the Bayes decision rule, we propose a new decision rule for making classification decisions based on the constructed minimum bounding spheres.
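To make the one-sphere-per-class scheme concrete, here is a minimal sketch of such a classifier. Both the sphere fit (centroid plus maximum distance) and the decision rule (smallest distance-to-center relative to the sphere radius) are simplified stand-ins chosen for illustration; they are neither the QP-based spheres nor the Bayes-based rule of the paper, and the class name is our own.

```python
import numpy as np

class SphereClassifier:
    """Toy per-class bounding-sphere classifier (illustrative only:
    crude centroid spheres and an ad hoc d/R decision rule, not the
    QP spheres or the Bayes rule proposed in the paper)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centers_, self.radii_ = {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            ctr = Xc.mean(axis=0)                    # crude center estimate
            self.centers_[c] = ctr
            # radius = distance to the farthest point of the class
            self.radii_[c] = np.linalg.norm(Xc - ctr, axis=1).max()
        return self

    def predict(self, X):
        # assign each point to the class whose sphere it lies
        # relatively deepest inside (smallest distance/radius ratio)
        scores = np.stack([
            np.linalg.norm(X - self.centers_[c], axis=1) / self.radii_[c]
            for c in self.classes_
        ])
        return self.classes_[np.argmin(scores, axis=0)]
```

Note the computational advantage described above: adding a class requires fitting only one new sphere, whereas one-against-one SVMs would require retraining a hyperplane against every existing class.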
We use the minimum bounding sphere as a means to estimate

doi:10.1016/j.neucom.2006.10.023
*Corresponding author. Tel.: +1 401 863 3920; fax: +1 401 863 3494.
E-mail addresses: jigang@physics.brown.edu (J. Wang), pedja@brown.edu (P. Neskovic), Leon_Cooper@brown.edu (L.N. Cooper).