Analysis of Information Measure for Sensory Motor Contingency toward Modeling Social Development

Hamed MAHZOON, Osaka University, hamed.mahzoon@irl.sys.es.osaka-u.ac.jp
Yuichiro YOSHIKAWA, Osaka University
Hidenobu SUMIOKA, ATR
Hiroshi ISHIGURO, Osaka University

In this research, we propose a developmental learning mechanism for the social interaction of a robot with a human. We focus on face-to-face interaction between a caregiver and an infant, and aim to produce a stochastic model for acquiring social skills based on information theory. Previous studies proposed a mechanism based on transfer entropy [1], in simulation and in real-time robot implementation, and showed how a robot can acquire social skills such as joint attention with a caregiver, or can develop its behavior. However, their method was either too time-consuming for real-time implementation or could not develop its social behavior when implemented on a real-world robot. We compare our new model with the previous method and show how it can evaluate existing contingency, based on the robot's experiences during interaction, within a time period short enough to be feasible on a real-world robot, and how it can develop the acquired skills.

Key Words: Transfer Entropy, Chain of Causality, Social Development

1. Introduction

How can we build a system that develops itself through social interaction? Which properties should a developmental robot have in order to become closer to a human? Recently, intelligent robots that act during interaction with humans have been studied actively [2][3][4]. These robots should be able to decide suitable action patterns in response to observed signals in order to establish communication with a human. Actions such as following each other's gaze or pointing to a place are part of the joint attention skill, which is essential for continuing communication with another agent.
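The paper builds on transfer entropy as its information measure but does not reproduce the estimator here. As a minimal sketch, transfer entropy from a source time series Y to a target X over discrete symbols, with one-step histories, can be estimated from empirical counts as below. The function name and the plug-in histogram estimator are illustrative assumptions, not the authors' implementation:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy T(Y -> X) with 1-step histories.

    T(Y->X) = sum over (x_next, x_prev, y_prev) of
              p(x_next, x_prev, y_prev) *
              log2( p(x_next | x_prev, y_prev) / p(x_next | x_prev) )
    """
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))  # (x_next, x_prev, y_prev)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))        # (x_prev, y_prev)
    pairs_xx = Counter(zip(x[1:], x[:-1]))         # (x_next, x_prev)
    singles = Counter(x[:-1])                      # x_prev
    n = len(x) - 1                                 # number of transitions
    te = 0.0
    for (xn, xp, yp), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(xp, yp)]            # p(x_next | x_prev, y_prev)
        p_cond_self = pairs_xx[(xn, xp)] / singles[xp]  # p(x_next | x_prev)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te
```

When X simply copies Y with a one-step delay, knowing y_prev removes all remaining uncertainty about x_next, so the estimate is positive; for series where Y adds no predictive information the estimate is zero.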
In cognitive developmental robotics [5], synthetic studies on developmental models of robots have received attention not only for communication ability with humans, but also for understanding the human developmental process. In developmental psychology, how infants acquire skills related to joint attention is one of the central topics [6]. In robotics, behaviors such as joint attention and their acquisition mechanisms have been studied. Previous research focused on the contingent relation between human gaze direction and the existence of an object, and showed that learning a sensorimotor mapping from human face patterns to the robot's motor commands leads the robot to acquire the gaze-following skill [3][7]. A mechanism for finding contingency between multimodal sensory signals and actions was proposed based on information theory [8]. In that work, the developmental process and the sequential acquisition of communication skills of the robot were compared with those of a human infant; however, the method is too time-consuming to implement on a real-world robot. A model that makes this mechanism implementable on a robot for real-time interaction with a human was then produced [9]. In that research, improved algorithms for finding contingency in a short time, with an increased number of robot sensory inputs and action abilities, were proposed, and it was shown how the robot's behavior changes during interaction with a human through social skill acquisition. However, the robot could not develop its acquired skills into more complex ones, such as the social referencing skill shown in previous work [8]. In this research, we focus on the concept of elements of contingency for finding complex contingency in a short time, which enables a robot to develop its acquired skills in the real world. We establish a simple computer simulation environment and show how our new information measure can detect contingency even with little robot experience. Finally, we compare our new measure with the measure of the previous work.

2. Problem Setting

We assume a face-to-face interaction between a human caregiver and a robot, as in previous work [9]. At each time step, both agents receive observations from the environment and from the other agent, and send commands to their joints for action. The robot stores the observation at each time step as a sensory variable S, the action taken at each time step as an action variable A, and the resultant observation of each time step's action as a resultant variable R. During interaction with the caregiver, the robot evaluates causality depending on the current values of a set of event variables consisting of S, A, and R. In the next section, we describe the method we use in this research to quantify existing causality.

3. Proposed Mechanism

Our proposed structure is based on previous work [8]; however, we improved it to overcome some shortcomings, which we discuss in the next part. Fig. 1 shows the structure of our new mechanism, which enables a robot to find existing contingency during interaction with the caregiver, to develop its behavior by acquiring skills that reproduce the contingency, and to use them during subsequent interaction steps. The proposed system consists of four main units: 1) a contingency detection unit (CDU), 2) a contingency reproducing unit (CRU), 3) a reactive behavior producing unit (RBU), and 4) an action selector. At each time step, the robot first evaluates the contingency of experienced events using the contingency detection unit. If the robot detects an experienced event as contingent, it saves the combination of the elements of that event set as a contingent skill (CS) in the CRU. It then decides the motion of each joint based on the action decision method of the action selector. In the first steps of the interaction, the robot has no skill in its CRU and acts based on the RBU's output action. However, after some experience during interaction, the robot starts to add detected contingent skills.

The Japan Society of Mechanical Engineers
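The per-step flow of the four units in Section 3 can be sketched as follows. The class, its method names, and the placeholder contingency test are illustrative assumptions; in the actual system, the CDU applies the information measure discussed above rather than a simple predicate:

```python
import random

class DevelopmentalAgent:
    """Sketch of the proposed loop: CDU -> CRU -> action selector / RBU.

    The unit names and their roles follow Section 3; the internals are
    illustrative placeholders, not the authors' implementation.
    """

    def __init__(self, cdu, actions):
        self.cdu = cdu          # CDU: callable judging an event (S, A, R) contingent
        self.skills = []        # CRU: stored contingent skills (CS)
        self.actions = actions  # action repertoire used by the RBU

    def reactive_behavior(self):
        # RBU: default exploratory action when no acquired skill applies
        return random.choice(self.actions)

    def step(self, s, a, r):
        # 1) CDU: evaluate the experienced event (S, A, R) for contingency
        if self.cdu(s, a, r) and (s, a, r) not in self.skills:
            self.skills.append((s, a, r))  # save the event set as a CS in the CRU
        # 2) Action selector: reproduce a matching contingent skill if one
        #    exists; otherwise fall back to the RBU's reactive behavior
        for skill_s, skill_a, _ in self.skills:
            if skill_s == s:
                return skill_a             # CRU reproduces the contingency
        return self.reactive_behavior()
```

Early in the interaction the CRU is empty, so every action comes from the RBU; once the CDU has flagged events as contingent, matching sensory states trigger the stored skill instead, mirroring the behavioral shift described in the text.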