Design Support System for Emotional Expression of Robot Partners using Interactive Evolutionary Computation

Koh Nishimura, Graduate School of System Design, Tokyo Metropolitan University, Tokyo, Japan, nishimura-koh@sd.tmu.ac.jp
Naoyuki Kubota, Graduate School of System Design, Tokyo Metropolitan University, Tokyo, Japan, kubota@tmu.ac.jp
Jinseok Woo, Graduate School of System Design, Tokyo Metropolitan University, Tokyo, Japan, woo-jinseok@sd.tmu.ac.jp

Abstract—Recently, the need for robot partners has been increasing. Such robots should have an emotional model in order to co-exist with people and to realize natural communication with them. In such communication, nonverbal communication and emotional expression based on an emotional model are very important for robot partners. Moreover, facial and gestural expression should adapt to the user of the robot. Therefore, we propose a design support system for the arm gestures and facial expressions of robot partners based on interactive evolutionary computation and Laban features. We then conduct several experiments with the proposed method and discuss its effectiveness.

Keywords—Robot Partners; Interactive Evolutionary Computation; Emotional Expressions; Emotional Model

I. INTRODUCTION

Recently, robot partners have been expected to play roles at home. Such robots should have an emotional model in order to co-exist with people and to realize natural communication. If robot partners can perceive human behavior based on an emotional model, they could achieve sympathetic understanding in the way people do. The relationship between emotion and communication in human beings has been discussed from various viewpoints, such as psychology, sociology, neurophysiology, and brain science [1-8]. In general, there are two approaches: macroscopic and microscopic discussions.
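The interactive evolutionary computation (IEC) named in the abstract can be illustrated with a minimal sketch. The following is not the authors' implementation: the parameter encoding, population size, and rating function below are all hypothetical assumptions for illustration. The defining feature of IEC is only that the fitness of each candidate expression comes from a human user's subjective rating rather than from an objective function.

```python
import random

# Hypothetical parameter vector for one emotional expression, e.g.
# [arm speed, arm amplitude, facial parameter 1, facial parameter 2],
# each normalized to [0, 1]. These names are illustrative only.
GENE_LENGTH = 4
POP_SIZE = 6          # kept small: the user must rate every candidate
MUTATION_STD = 0.1

def random_individual():
    return [random.random() for _ in range(GENE_LENGTH)]

def mutate(ind):
    # Gaussian perturbation, clipped back into [0, 1].
    return [min(1.0, max(0.0, g + random.gauss(0.0, MUTATION_STD)))
            for g in ind]

def crossover(a, b):
    # Uniform crossover: each gene taken from either parent.
    return [random.choice(pair) for pair in zip(a, b)]

def evolve(rate, generations=10):
    """rate(individual) -> subjective score. In a real IEC system this
    is the human user's rating of the robot's expression; here a stand-in
    function is used so the sketch can run."""
    pop = [random_individual() for _ in range(POP_SIZE)]
    for _ in range(generations):
        scored = sorted(pop, key=rate, reverse=True)
        parents = scored[: POP_SIZE // 2]          # keep best-rated half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        pop = parents + children
    return max(pop, key=rate)

# Stand-in for a user who prefers slow, large-amplitude gestures.
best = evolve(lambda ind: ind[1] - ind[0])
```

Because every evaluation requires a human judgment, IEC systems typically use small populations and few generations, which is why the sketch keeps both numbers low.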
In the discussion of emotion in psychology from the macroscopic point of view, the human body is treated as a black box, and the internal processing of the mind is discussed. The relationship between emotional expressions and social behaviors has been discussed in sociology. For example, Keltner and Kring discussed the roles of emotion in social interaction from three different viewpoints: (1) informative functions, (2) evocative functions, and (3) incentive functions [5]. In informative functions of emotion, emotional expression conveys information about the senders themselves and also about objects and events in the social environment. In evocative functions of emotion, one individual's emotional expression serves as a social affordance that evokes "prepared" responses in others. In incentive functions of emotion, an individual's expression and experience of emotion may provide incentives for, or reinforce, another individual's social behavior within ongoing interactions. In this way, emotional expression plays a very important role in social interactions.

On the other hand, from the microscopic point of view, emotion is discussed in terms of neurotransmitters and neuronal networks in brain science. The American Heritage Dictionary defines emotion as an intense mental state that arises subjectively rather than through conscious effort and is often accompanied by physiological changes. Furthermore, the constructivist approach to emotion is often discussed from the viewpoint of modeling, using mathematical and systems models based on theories from psychology and brain science. To summarize, in research on emotion, it is very important to analyze and understand the human body and human society.

Natural communication based on emotion is one of the important topics in research on human-friendly robots [9-11]. In general, human communication is carried out through the perception of others' intentions and feelings.
This indicates that an emotional model is very helpful for a human to understand the state of a robot. Furthermore, emotions influence human actions, as in the incentive functions of emotion. Human emotion is complexly connected to facial expression and to features of body movement such as arm gestures. Human emotion influences emotional expression, while emotional expression in turn influences human emotion; this is a mutually nested structure. For example, emotion alters facial expression and gesture features unconsciously. Conversely, facial expressions and arm gestures can evoke the emotions that correspond to them. Arm gestures carry common signs and geometric information in addition to emotional information, so we can instinctively read others' emotions from visual information.

In this research, we focus on the emotional expressions of human-like robot partners, because such expressions facilitate natural communication with people. If a robot partner has human-like

WCCI 2012 IEEE World Congress on Computational Intelligence, June 10-15, 2012, Brisbane, Australia (FUZZ-IEEE). U.S. Government work not protected by U.S. copyright.