Classification of Upper and Lower Face Action Units and Facial Expressions using Hybrid Tracking System and Probabilistic Neural Networks

HADI SEYEDARABI*, WON-SOOK LEE** AND ALI AGHAGOLZADEH*
*Faculty of Electrical and Computer Engineering, University of Tabriz, Tabriz, Iran
**School of Information Technology and Engineering (SITE), Faculty of Engineering, University of Ottawa, Canada
{seyedarabi, aghagol}@tabrizu.ac.ir, wslee@uottawa.ca

Abstract: - Most human emotions are communicated by changes in one or two discrete facial features. These changes are coded as Action Units (AUs). In this paper, we develop a classification system for lower and upper face AUs as well as the six basic emotions. We use an automatic hybrid tracking system, based on a novel two-step active contour tracker for the lower face and a cross-correlation based tracker for the upper face, to detect and track Facial Feature Points (FFPs). The extracted FFPs are used to compute geometric features that form a feature vector, which is used to classify input image sequences into AUs and basic emotions. Classification is done using Probabilistic Neural Networks (PNN) and a rule-based system. Experimental results show robust facial feature detection and tracking and reasonable classification, with an average AU recognition rate of 85.98% for the lower face and 86.93% for the upper face, and an average basic emotion recognition rate of 96.11%.

Key-Words: - Active contours, Action Units, Facial Expressions, Probabilistic Neural Networks

1 Introduction
The face plays an essential role in interpersonal communication. Automating facial expression analysis could bring facial expressions into man-machine interaction. Most computer-vision based approaches [][] to facial expression analysis attempt to recognize only prototypic emotions. These prototypic emotions seem to be universal across human ethnicities and cultures and comprise happiness, sadness, fear, disgust, surprise, and anger.
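The classification stage described in the abstract feeds geometric feature vectors to Probabilistic Neural Networks. As a minimal sketch of the general PNN idea only (a Parzen-window density estimate per class with Gaussian pattern units and a winner-take-all output), the toy feature vectors, class labels, and smoothing width sigma below are hypothetical and not taken from the paper:

```python
import numpy as np

def pnn_classify(train_X, train_y, x, sigma=0.3):
    """Return the class whose Gaussian-kernel density estimate at x is largest."""
    train_X = np.asarray(train_X, dtype=float)
    train_y = np.asarray(train_y)
    x = np.asarray(x, dtype=float)
    best_class, best_score = None, -1.0
    for c in np.unique(train_y):
        Xc = train_X[train_y == c]                       # pattern units of class c
        d2 = np.sum((Xc - x) ** 2, axis=1)               # squared distances to x
        score = np.mean(np.exp(-d2 / (2 * sigma ** 2)))  # summation unit for class c
        if score > best_score:
            best_class, best_score = c, score
    return best_class

# Hypothetical 2-D "geometric feature" vectors for two classes
X = [[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.0, 0.9]]
y = [0, 0, 1, 1]
print(pnn_classify(X, y, [0.05, 0.05]))  # → 0
```

A PNN of this kind trains in one pass (each training sample simply becomes a pattern unit), which is one reason it suits small labeled AU datasets; the smoothing parameter sigma is the only quantity to tune.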
In everyday life, however, such prototypic expressions occur relatively infrequently. Instead, emotion is communicated by changes in one or two discrete features. These changes are coded as Action Units (AUs). AUs are visible appearances of facial muscle actions defined by ????, widely used by ???, and also a standard in MPEG-4 ???. Among the facial features, the mouth is the most deformable, and tracking and extracting the shape of the lips is highly complicated; accordingly, 17 of the 44 AUs in the Facial Action Coding System (FACS) are directly associated with mouth movement [1]. Table 1 shows the AUs used in this work, which occur in the lower and upper face and are among the most important for describing facial expressions.

In recent years, there has been extensive research on facial expression analysis and recognition. Pantic and Rothkrantz [2] proposed an expert system for automatic analysis of facial expressions from a still full-face image. Their system consists of two major parts: the first forms a framework for hybrid facial feature detection, and the second converts low-level face geometry into high-level facial actions.

Table 1. Some of the FACS AUs used in this work

AU (Upper Face)  FACS description    AU (Lower Face)  FACS description
1                Raised inner brows  12               Mouth corners pulled up
2                Raised outer brows  15               Mouth corners pulled downwards
4                Lowered brows       17               Raised chin
5                Raised upper lid    20               Mouth stretched
6                Raised cheek        23               Lips tightened
7                Raised lower lid    24               Lips pressed
9                Wrinkled nose       25               Lips parted
-                -                   26               Jaw dropped
-                -                   27               Mouth stretched

Lien et al. [3] developed a facial expression recognition system that was sensitive to subtle changes in the face. The extracted feature