Artificial Emotions in Human-Robot Collaboration

Jekaterina Novikova (e-mail: j.novikova@bath.ac.uk), University of Bath

Figure 1: Expressing artificial emotions with a Lego Mindstorms NXT robot.

1 Introduction

The most advanced service robots are still not smart enough to perform service or household tasks in complex working environments such as homes, offices or hospitals. One of the ten challenges for making automation a team player in joint human-agent activity states: "To be an effective team player, intelligent agents must be able to make pertinent aspects of their status and intentions obvious to their teammates" [Klein et al. 2004]. During my PhD I aim to address the problem of enabling humans to better understand machines by examining the role of artificial emotions synthesized and expressed by robots in the process of human-robot collaboration.

2 Current Research

I have proposed an approach for modelling action selection based on artificial emotions and for signalling a robot's internal state to human team members. I have also completed a series of studies with an expressive non-humanoid Lego robot.

2.1 Modelling Artificial Emotions

I have developed an initial framework for modelling artificial robotic emotions [Novikova et al. 2013]. This framework builds on previous research in computational theories of emotion that combine appraisal and dimensional models [Gebhard 2005]. The first phase of emotional action selection detects specific internal and/or external conditions. To determine an appropriate emotional state, a simple valence-arousal representation is used, in a manner analogous to Russell's approach. All the detected conditions influence both the valence and arousal values, and thereby the robot's emotive response. I also use intensity as an additional property of an emotion, which changes dynamically while the robot is experiencing the emotion. Intensity depends on time, the number of detected stimuli, and an impact factor of the executed behaviour. Each emotion calls a specific behaviour of a dynamic plan.
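The valence-arousal selection and the latched intensity dynamics of this model can be sketched in Python as follows. This is a minimal illustration, not the actual implementation: the quadrant emotion labels, the `rise_rate` parameter, and the class and function names are assumptions; only the zero-to-100 intensity range and the roles of time, stimuli, and the impact factor come from the model described here.

```python
MAX_INTENSITY = 100.0  # maximum intensity threshold used in the model


def select_emotion(valence, arousal):
    """Map a point in valence-arousal space to an emotion label,
    in a manner analogous to Russell's circumplex (labels illustrative)."""
    if arousal >= 0:
        return "joy" if valence >= 0 else "anger"
    return "contentment" if valence >= 0 else "sadness"


class EmotionLatch:
    """'Feeling' an emotion as a latched process: intensity rises from zero
    while stimuli are detected, and is depressed back toward zero by the
    impact factor of the executed behaviour once that behaviour inhibits it."""

    def __init__(self, impact_factor, rise_rate=10.0):
        self.intensity = 0.0
        self.impact_factor = impact_factor  # how strongly the behaviour depresses intensity
        self.rise_rate = rise_rate          # intensity gained per stimulus per time unit (assumed)
        self.inhibited = False

    def step(self, dt, n_stimuli=0):
        """Advance the latch by dt time units, given n_stimuli detected stimuli."""
        if not self.inhibited:
            # intensity grows with time and with the number of detected stimuli
            self.intensity = min(MAX_INTENSITY,
                                 self.intensity + self.rise_rate * n_stimuli * dt)
        else:
            # the executed behaviour depresses intensity via its impact factor
            self.intensity = max(0.0, self.intensity - self.impact_factor * dt)
        return self.intensity
```

With these illustrative parameters, two stimuli per unit step drive the intensity to its ceiling of 100 after five steps, and setting `inhibited` lets the behaviour's impact factor drain it back to zero, which reproduces the latch-and-release shape of the model.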
An impact factor is a property of a behaviour that depresses the intensity of the emotion this behaviour was triggered by. 'Feeling' an emotion is modelled as a latched process, during which the intensity of the emotion increases over time from zero up to the maximum threshold of 100, and is reduced back to zero after the executing behaviour inhibits it.

2.2 Expressing and Observing Artificial Emotions

A series of studies was conducted [Novikova and Watts 2013] in order to better understand whether a non-humanoid Lego Mindstorms NXT robot can express artificial emotions in a manner that is understandable to a human. In the first study, participants were presented with static pictures of different robot expressions and asked to guess the observed robot emotion. The second study was designed to overcome the identified methodological limitations: 1) a static image may not convey the same meaning as the experience of seeing an expressive state performed in real time; 2) forcing participants to use emotional labels undermines the validity of claims that emotional terms are spontaneously appropriate for robot signals; and 3) there was no context given to participants within which to interpret the signals. The results show that participants tend to describe observed robotic expressions in an emotional tone, and are usually more confident in their judgment of an observed emotion when it is presented dynamically rather than as a static image. Participants had fair to moderate agreement on the emotions with a high arousal level; the emotions with positive valence were more difficult to recognize and agree on. A thematic map was created showing the main themes that emerged from the analysis of the qualitative data collected during the studies.

References

GEBHARD, P. 2005. ALMA: A layered model of affect. In Proceedings of the 4th International Joint Conference on Autonomous Agents and Multiagent Systems, ACM, 29–36.
KLEIN, G., WOODS, D. D., BRADSHAW, J. M., HOFFMAN, R. R., AND FELTOVICH, P. J. 2004. Ten challenges for making automation a "team player" in joint human-agent activity. IEEE Intelligent Systems 19, 6, 91–95.

NOVIKOVA, J., AND WATTS, L. 2013. Artificial emotions to assist social coordination in HRI. In Workshop on Embodied Communication of Goals and Intentions at the ICSR.

NOVIKOVA, J., GAUDL, S., AND BRYSON, J. J. 2013. Emotionally driven robot control architecture for human-robot interaction. In Proceedings of TAROS.