Emotional Expression Humanoid Robot WE-4RII
-Evaluation of the perception of facial emotional expressions by using fMRI-

M. Zecca 1,2, T. Chaminade 3, M.A. Umiltà 4, K. Itoh 2,5, M. Saito 6, N. Endo 6, Y. Mizoguchi 6, S. Blakemore 3, C. Frith 3, V. Gallese 4, G. Rizzolatti 4, S. Micera 7, P. Dario 2,7, H. Takanobu 2,8,9, A. Takanishi 1,2,5,9,10

1. Consolidated Research Institute for Advanced Science and Medical Care, Waseda University, Tokyo, Japan, email: zecca@aoni.waseda.jp; takanisi@waseda.jp
2. RoboCasa, Waseda University, Tokyo, Japan
3. Wellcome Department of Imaging Neuroscience, University College London
4. Dipartimento di Neuroscienze, Sezione di Fisiologia, Università di Parma
5. Department of Mechanical Engineering, Waseda University, Tokyo, Japan
6. Graduate School of Science and Engineering, Waseda University, Tokyo, Japan
7. ARTS Lab, Scuola Superiore Sant'Anna, Pisa, Italy
8. Department of Mechanical Systems Engineering, Kogakuin University, Tokyo, Japan
9. Humanoid Robotics Institute (HRI), Waseda University, Tokyo, Japan
10. Advanced Research Institute for Science and Engineering, Waseda University, Tokyo, Japan

Personal robots and robot technology (RT)-based assistive devices are expected to play a major role in our elderly-dominated society, actively participating in joint work and community life with humans. To achieve a smooth and natural integration between humans and robots, interaction at the emotional level is a fundamental requirement. The objective of this research, therefore, is to clarify how the emotions expressed by a humanoid robot are perceived by humans. The preliminary results show several similarities, but also several differences, in perception.

Key Words: facial emotional expression, humanoid robot, fMRI

1 Introduction

Japan has the world's highest percentage of senior citizens over 65 (21%) and the smallest percentage of children under 15 (13.6%) [1].
These figures show that Japanese society is aging much faster than expected, and they underscore the effects of a shrinking birthrate [2]. In this aging society, it is expected that there will be a growing need for home, medical, and nursing care services, including those provided by robots, to assist the elderly on both the physical and the psychological levels [3]. In this regard, human-robot communication and interaction are very important, particularly in the case of home and personal assistance for elderly and/or handicapped people. If a robot had a "mind" (intelligence, emotion, and will) similar to the human one, it would be much easier for the robot to achieve smooth and natural adaptation and interaction with its human partners and the environment [4].

Takanishi et al. have been developing the WE-3 (Waseda Eye No.3) series since 1995. So far they have achieved coordinated head-eye motion with V.O.R. (Vestibulo-Ocular Reflex), depth perception using the angle of convergence between the two eyes, adjustment to the brightness of an object with the eyelids, and four senses: visual, auditory, cutaneous, and olfactory. In addition, they obtained the expression of emotions using not only the face but also the upper half of the body with the Emotion Expression Humanoid Robot WE-4 (Waseda Eye No.4) series, equipped with a waist, 9-DOF emotion expression humanoid arms, and the humanoid robot hands RCH-1 (Robo Casa Hand No.1) [5-7].

The transmission of emotions by WE-4RII was evaluated by showing movies of its six basic emotional expressions to many subjects, who chose the emotion they thought the robot had expressed. The averaged recognition rate over all emotional expressions of WE-4RII was 93.5 [%], which proved that WE-4RII can effectively convey its emotions using its upper-half bodily expressions [7]. However, this kind of analysis lacks objectivity.
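The averaged recognition rate mentioned above can be illustrated with a short sketch. The response counts below are hypothetical, not the actual experimental data from [7]; the sketch only shows how per-emotion and averaged recognition rates would be computed from subjects' forced-choice responses.

```python
# Illustrative sketch with HYPOTHETICAL response counts: computing a
# per-emotion recognition rate and the averaged rate over all six
# emotional expressions, as in the WE-4RII evaluation procedure.

emotions = ["Happiness", "Anger", "Surprise", "Sadness", "Disgust", "Fear"]

# responses[expressed][chosen] = number of subjects who chose `chosen`
# after watching the movie of `expressed` (invented numbers, 20 subjects each)
responses = {
    "Happiness": {"Happiness": 20},
    "Anger":     {"Anger": 19, "Disgust": 1},
    "Surprise":  {"Surprise": 18, "Fear": 2},
    "Sadness":   {"Sadness": 19, "Fear": 1},
    "Disgust":   {"Disgust": 18, "Anger": 2},
    "Fear":      {"Fear": 18, "Surprise": 2},
}

def recognition_rate(expressed, counts):
    """Fraction of subjects who correctly identified the expressed emotion."""
    total = sum(counts.values())
    return counts.get(expressed, 0) / total

per_emotion = {e: recognition_rate(e, responses[e]) for e in emotions}
average = sum(per_emotion.values()) / len(per_emotion)
print(f"Averaged recognition rate: {average * 100:.1f}%")
```

As the text notes, such a rate reflects subjective forced-choice judgments, which motivates the search for a more objective measure below.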
In order to obtain more objective data about the users' perception of the emotions, a different approach should be pursued. The mirror neuron system [8] comprises neurons in our brain that fire both when we perform an action and when we observe the same action performed by someone else. The function of the mirror system is a subject of much speculation: these neurons may be important for understanding the actions of other people and for learning new skills by imitation. It is also considered that the mirror neuron system plays an important role in the recognition of emotions.

The objective of this research, therefore, is to clarify how the emotions expressed by a humanoid robot are perceived by humans.

2 Material and Methods

2.1 Emotion Expression Humanoid Robot WE-4RII

The Emotion Expression Humanoid Robot WE-4RII (see Fig. 1), developed in the Takanishi lab, is capable of expressing 6 different emotions (Happiness, Anger, Surprise, Sadness, Disgust, Fear) by using facial expressions and movements of the neck, the arms, and the hands [7].