RACHEL: DESIGN OF AN EMOTIONALLY TARGETED INTERACTIVE AGENT
FOR CHILDREN WITH AUTISM
Emily Mower^1, Matthew P. Black^1, Elisa Flores^2, Marian Williams^3, Shrikanth Narayanan^1
University of Southern California (USC)
^1 Signal Analysis and Interpretation Laboratory, USC, Los Angeles, California, USA
^2 USC University Center for Excellence in Developmental Disabilities at Children’s Hospital Los Angeles
^3 Keck School of Medicine, USC, Los Angeles, California, USA
{mower, matthepb}@usc.edu, {elflores, mwilliams}@chla.usc.edu, shri@sipi.usc.edu
ABSTRACT
Increasingly, multimodal human-computer interactive tools are
leveraged in both autism research and therapies. Embodied conver-
sational agents (ECAs) are employed to facilitate the collection of
socio-emotional interactive data from children with autism. In this
paper we present an overview of the Rachel system developed at the
University of Southern California. The Rachel ECA is designed to
elicit and analyze complex, structured, and naturalistic interactions
and to encourage affective and social behavior. Pilot studies
suggest that this tool can effectively elicit social conversational
behavior. This paper presents a description of the multimodal
human-computer interaction system and an overview of the collected
data. Future work includes utilizing signal processing techniques to
provide a quantitative description of the interaction patterns.
Index Terms: Embodied conversational agent, multimodal inter-
face, audio-video recording, autism, children’s speech
1. INTRODUCTION
It is currently estimated that 1 in 110 children is diagnosed with
autism [1]. Autism spectrum disorders (ASD) are pervasive devel-
opmental disorders characterized by difficulties in social communi-
cation, social reciprocity, and repetitive or stereotyped behaviors and
interests [2]. Specific communication impairments may include: de-
lays in spoken language; reduced communication maintenance abil-
ities; repetitive and stereotyped language usage; and deficits in sym-
bolic and imaginative play [3]. These deficits may be manifested in
both verbal expressions (e.g., prosody, word choice) and nonverbal
behaviors (e.g., facial expressions, gesturing, and use of eye contact).
Quantitative evaluations can aid in identifying child-specific social
deficits, potentially producing, in the long term, more effective and
targeted interventions. Embodied conversational agents (ECAs) pro-
vide an important platform for analyzing a child’s multimodal com-
municative patterns. ECAs produce consistent and modifiable sce-
narios permitting a controlled evaluation of a child’s communicative
abilities. The goal of the current work is to determine how ECA
technologies can be designed to elicit and evaluate natural affective
child communication patterns.
Social interactions are oftentimes challenging for children with
ASD, due to deficits in one or more of the following: social
communication, reciprocal social interaction, and imagination [4].
ECAs provide a context in which to practice and learn aspects of
these social skills. These child-agent interactions also provide a
semi-structured environment, in which researchers can collect child-
specific interaction data. This style of data collection provides con-
texts in which specific behavioral and interaction patterns can be
modeled and analyzed [5]. In this paper we introduce “Rachel,” a
system for eliciting social narrative data in a semi-structured con-
text. The Rachel system has been specifically designed to facilitate
a study of the interaction patterns of children with ASD. The goal of
the system is to create a highly-controlled interaction scenario that
will facilitate post-hoc analysis furthering the understanding of the
interaction patterns that exist between children with ASD and ma-
chines, as well as differences between children with ASD and typi-
cally developing children. The future goals of this work include pro-
viding therapists and clinicians with an evaluation of child-specific
social deficits that can be targeted for intervention therapies, as well
as exploring the potential of ECAs as intervention partners.
The use of ECA technology in autism therapy and research is
widespread. These interactive technologies offer children an
alternative means of exploring and learning social communication
strategies. In [4, 6] the authors used an ECA for a collaborative,
interactive, and child-directed social storytelling (narrative) task,
allowing children to learn social interactive behavior in a socially
non-threatening environment. In [7], the
authors analyzed the differences in interaction patterns when a hu-
man is interacting with a virtual agent vs. another human in terms
of speech activity and gaze patterns. In [8] the authors performed a
similar comparison using data collected from children. ECAs have
also been successfully used in educational contexts [9] and in lan-
guage learning tasks [10].
The Rachel system is a novel experimental paradigm for the col-
lection of emotionally grounded narrative data in a structured and
controlled environment. This tool is adapted from an earlier ECA in-
terface developed at the University of Southern California, as a part
of the Children’s Interactive Multimedia Project (ChIMP) [8,11–13].
The Rachel experiments are broken into four distinct sessions, which
are ordered by anticipated increasing difficulty. Each session is de-
signed to teach a concept of emotional behavior while providing an
indication of each child’s unique emotional understanding. Accom-
panying each session is a storytelling task to capture the child’s abil-
ity to create a narrative and to further assess the child’s emotional
understanding.
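The four-session structure described above can be sketched as a small data structure with a check on the anticipated-difficulty ordering. This is a minimal illustrative sketch: the session names, difficulty values, and storytelling prompts below are assumptions for the example, not the actual Rachel curriculum.

```python
from dataclasses import dataclass

@dataclass
class Session:
    concept: str            # emotional-behavior concept the session teaches
    difficulty: int         # anticipated difficulty (1 = easiest)
    storytelling_prompt: str  # accompanying narrative task

# Hypothetical four-session protocol, ordered by anticipated difficulty.
PROTOCOL = [
    Session("identifying emotions", 1, "tell a story about a happy day"),
    Session("expressing emotions", 2, "tell a story about feeling surprised"),
    Session("causes of emotions", 3, "tell a story about why someone is sad"),
    Session("responding to emotions", 4, "tell a story about helping a friend"),
]

def ordered_by_difficulty(protocol):
    """Return True if consecutive sessions never decrease in difficulty."""
    return all(a.difficulty <= b.difficulty
               for a, b in zip(protocol, protocol[1:]))
```

Pairing each session with both a teaching concept and a narrative prompt mirrors the design above, where the storytelling task doubles as an assessment of the child's emotional understanding.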
The novelty of this work is in the design of an ECA that facil-
itates the collection of synchronized multimodal data (audio, video,
physiological) of interactions between a child, his/her parent, and an
ECA. This experimental paradigm will permit the targeted evalua-
tion of child emotional and language expression patterns in an inter-
active context. Pilot results from two children with autism suggest
978-1-61284-350-6/11/$26.00 ©2011 IEEE