Eigenface Based Emotion Analysis Algorithm and Implementation to Humanoid Robot

Fatma Göngör and Önder Tutsoy
Adana Science and Technology University, Department of Electrical and Electronic Engineering, Adana, Turkey
{ftmgongor@gmail.com, otutsoy@adanabtu.edu.tr}

ABSTRACT: This paper proposes a 3-stage emotion analysis algorithm and presents its application to the NAO humanoid robot. Initially, the robot specifies the boundaries of the face using the Eigenface approach, and then the corresponding facial distances are measured with the Euclidean distance measurement technique. Finally, the measured facial distances are classified with Artificial Neural Networks to recognize the instantaneous emotional state of the human. The reliability of the performed emotion analysis is verified by analyzing each terminal decision reached from the facial distance measurements.

Keywords: Artificial Neural Networks, emotion analysis, Eigenface, Euclidean distance measurement technique, facial expression, humanoid robot.

1. Introduction

Humanoid Robots (HRs) are increasingly becoming an integral part of our daily lives, and scenarios in which robots share the same workspace in cooperation with humans are spreading widely. For example, robots are used for elderly care [1,2], rehabilitation and health care [3-5], education [6,7], entertainment [8-10], personal companionship [11-14], guidance [15-18] and reception [19,20]. To achieve these challenging tasks, social abilities are essential for HRs to coexist with humans harmoniously and to cooperate with them efficiently. One major goal of creating a socially interactive HR is for the robot to be considered no longer merely a tool or machine but rather a partner. To this end, an HR has to behave in a proper and natural way, as in Human-Human Interaction (HHI).

Keltner and Kring pointed out a highly dependent relation between emotion and social interaction [21]. They stated that emotions serve a set of functions, such as providing information to conspecifics about the environment or promoting strong social relationships, that are critical for coordinating social interactions. Furthermore, Jayagopi et al. attempted to identify and build a dataset of social cues, including para-verbal and nonverbal cues, to improve Human-Robot Interaction (HRI) [22]. Based on these motivations, to pursue a more general social HRI, HRs are expected to perform various advanced tasks, such as deciding on the content of a communication based on the emotions of a human. In this case, the human face is the most powerful candidate for analyzing emotional states. Therefore, designing autonomous HRs with efficient emotion analysis capability leads to more sophisticated social robot designs.

A great number of facial emotion analysis algorithms have been developed in the literature. Gratch and Marsella stated the importance of understanding emotions in education and focused on computational approaches to facial emotion analysis [23]. Zhou et al. used an Embedded Hidden Markov Model (EHMM) to recognize facial expressions for real-time interactive video games [24]. Ballihi et al. developed a further emotion analysis algorithm to detect only negative and positive facial expressions [25]. This algorithm takes into account not only the facial muscle