Int J Soc Robot
DOI 10.1007/s12369-010-0071-x
Communication of Emotion in Social Robots through Simple Head and Arm Movements
Jamy Li · Mark Chignell
Accepted: 2 August 2010
© Springer Science & Business Media BV 2010
Abstract Understanding how people perceive robot gestures will aid the design of robots capable of social interaction with humans. We examined the generation and perception of a restricted form of gesture in a robot capable of simple head and arm movement, referring to point-light animation and video experiments in human motion to derive our hypotheses. Four studies were conducted to look at the effects of situational context, gesture complexity, emotional valence and author expertise. In Study 1, four participants created gestures with corresponding emotions based on 12 scenarios provided. The resulting gestures were judged by 12 participants in a second study. Participants' recognition of emotion was better than chance and improved when situational context was provided. Ratings of lifelikeness were found to be related to the number of arm movements (but not head movements) in a gesture. In Study 3, five novices and five puppeteers created gestures conveying Ekman's six basic emotions, which were shown to 12 Study 4 participants. Puppetry experience improved identification rates only for the emotions of fear and disgust, possibly because of limitations with the robot's movement. The results demonstrate the communication of emotion by a social robot capable of only simple head and arm movement.
Funding provided by the Japan Society for the Promotion of Science
(JSPS), the Natural Sciences and Engineering Research Council of
Canada (NSERC), Bell University Labs and the University of Toronto.
J. Li (✉) · M. Chignell
Department of Mechanical and Industrial Engineering, University of Toronto, 5 King's College Road, Toronto, ON M5S 3G8, Canada
e-mail: jamy.li@utoronto.ca
M. Chignell
e-mail: chignell@mie.utoronto.ca
Keywords Human-robot interaction · Gesture design ·
Communication of emotions · Puppetry
1 Introduction
Modern robots are no longer designed solely to function as manufacturing aids; they are also being introduced as social partners. In light of the roles robots are adopting as household pets (e.g., Sony's AIBO), domestic helpers (iRobot's Roomba), healthcare assistants (RIKEN Japan's Ri-Man), emotional companions (AIST's PARO) and educational aids (MIT's Kismet and Leo), appropriate social behaviour is critical for people to develop personal relationships with such agents. Many authors have called for better design of robots capable of engaging in meaningful social interactions with people (e.g., [1, 2]). This new breed of robots is called "socially interactive robots" or "social robots" [1].
The use of gestures has been identified as crucial to the design of such robots [2]. Research on robot gestures is needed because: (1) studying gesture interpretation is necessary to improve human-robot interaction, especially for robots that have limited ability for vocal and facial expressivity; (2) previous research in HRI has focused on how gestures are created without evaluating people's understanding of those gestures, so little is known about what factors affect gesture perception; and (3) no previous work has investigated the characteristics of "good" designers and the role of expertise in gesture authorship. Current practice in the design of robot gestures has robot inventors and researchers devising gestures based on their own experience and sometimes drawing upon fields such as dance (e.g. [3]). These methods may be convenient, but little work has been done to