A Naïve Bayes Classifier with Distance Weighting for Hand-Gesture Recognition

Pujan Ziaie, Thomas Müller, Mary Ellen Foster, and Alois Knoll

Technical University of Munich, Dept. of Informatics VI, Robotics and Embedded Systems, Boltzmannstr. 3, DE-85748 Garching, Germany, {ziaie,muelleth,foster,knoll}@cs.tum.edu

Abstract. We present an effective and fast method for static hand-gesture recognition. This method classifies the different gestures according to geometric-based invariants which are obtained from image data after segmentation; thus, unlike many other recognition methods, it does not depend on skin color. Gestures are extracted from each frame of the video, with a static background. The segmentation is done by dynamic extraction of background pixels according to the histogram of each image. Gestures are classified using a weighted K-Nearest-Neighbors algorithm which is combined with a naïve Bayes approach to estimate the probability of each gesture type.

Key words: Image Processing; Gesture Recognition; K-Nearest Neighbors; Naïve Bayes; Classification; Human-Robot Interaction

1 Introduction

When humans interact with one another – and with artificial agents – they make extensive use of a range of non-verbal behavior in addition to communicating via speech. Processing and understanding the non-verbal parts of human communication are crucial to supporting smooth interaction between a human and a robot.

We concentrate on the task of hand-gesture recognition: recognizing and classifying the hand shapes and motions of a human user in the context of a cooperative human-robot assembly task. Hand gestures play an important role in this type of interaction, both as an accompaniment to speech and as a means of input in their own right. For example, if a user wants to tell a robot to pick up a certain object among many other objects, it can be difficult to indicate the desired object using only speech.
However, if the user combines saying "Pick up that object" with a pointing gesture at the target object, the request becomes much easier to process. Hand gestures can also themselves provide strong indications of the user's intentions in the absence of speech: for example, users might move their hand near an object in preparation for picking it up, or may hold out their hand to indicate that they need the robot to hand over a particular object.
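The paper's actual formulation of the classifier is developed in later sections; as a rough, hedged illustration of the classifier family named in the abstract (distance-weighted K-Nearest Neighbors whose per-class votes are combined with class priors into a Bayes-style probability estimate), consider the following sketch. The feature vectors here merely stand in for the geometric invariants described in the abstract, and the inverse-distance weighting and prior combination are plausible assumptions, not the authors' exact equations.

```python
import math
from collections import defaultdict

def classify(x, train, k=5, priors=None):
    """Distance-weighted KNN combined with class priors (sketch).

    x      -- query feature vector (stand-in for geometric invariants).
    train  -- list of (feature_vector, label) pairs.
    k      -- number of neighbors to consider.
    priors -- optional dict of class priors P(c); uniform if omitted.
    Returns a dict mapping each gesture label to a probability.
    """
    # Take the k training samples closest to the query (Euclidean distance).
    nearest = sorted(
        (math.dist(x, f), label) for f, label in train
    )[:k]

    # Inverse-distance weights: closer neighbors vote more strongly.
    votes = defaultdict(float)
    for d, label in nearest:
        votes[label] += 1.0 / (d + 1e-9)

    # Combine the weighted votes with class priors and normalize,
    # yielding a probability estimate for each gesture type.
    labels = {label for _, label in train}
    if priors is None:
        priors = {label: 1.0 / len(labels) for label in labels}
    scores = {c: priors.get(c, 0.0) * votes.get(c, 0.0) for c in labels}
    z = sum(scores.values()) or 1.0
    return {c: s / z for c, s in scores.items()}
```

For instance, with two "point" samples near the origin and two "grasp" samples near (1, 1), a query at (0.05, 0.0) yields a distribution strongly favoring "point", since its neighbors are both closer and more numerous among the top k.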