MULTIMODAL AFFECT MODELING AND RECOGNITION FOR EMPATHIC ROBOT COMPANIONS

GINEVRA CASTELLANO
School of Electronic, Electrical and Computer Engineering, University of Birmingham, Birmingham, United Kingdom
g.castellano@bham.ac.uk

IOLANDA LEITE, ANDRÉ PEREIRA, CARLOS MARTINHO and ANA PAIVA
INESC-ID and Instituto Superior Técnico, Technical University of Lisbon, Porto Salvo, Portugal

PETER W. MCOWAN
School of Electronic Engineering and Computer Science, Queen Mary University of London, London, United Kingdom

Received 5 July 2012
Accepted 21 January 2013
Published 2 April 2013

Affect recognition for socially perceptive robots relies on representative data. While many of the existing affective corpora and databases contain posed and decontextualized affective expressions, affect resources for designing an affect recognition system in naturalistic human-robot interaction (HRI) must include context-rich expressions that emerge in the same scenario as the final application. In this paper, we propose a context-based approach to the collection and modeling of representative data for building an affect-sensitive robotic game companion. To illustrate our approach, we present the key features of the Inter-ACT (INTEracting with Robots-Affect Context Task) corpus, an affective and contextually rich multimodal video corpus containing affective expressions of children playing chess with an iCat robot. We show how this corpus can be successfully used to train a context-sensitive affect recognition system (a valence detector) for a robotic game companion. Finally, we demonstrate how the integration of the affect recognition system in a modular platform for adaptive HRI makes the interaction with the robot more engaging.

Keywords: Affect recognition; non-verbal behavior; context-sensitivity; human-robot interaction; social robotics.

International Journal of Humanoid Robotics, Vol. 10, No. 1 (2013) 1350010 (23 pages)
© World Scientific Publishing Company
DOI: 10.1142/S0219843613500102