Visual scanning of faces in 22q11.2 deletion syndrome: Attention to the mouth or the eyes?

Linda Campbell a,b,c,⁎, Kathryn McCabe a,b, Kate Leadbeater a, Ulrich Schall a,b,c, Carmel Loughland a,b,c, Dominique Rich d

a Priority Research Centre for Brain and Mental Health, University of Newcastle, NSW, Australia
b Schizophrenia Research Institute Australia (SRI), Sydney, NSW, Australia
c Hunter Medical Research Institute, Australia
d Centre for Brain and Mental Health Research, University of Newcastle, Australia

⁎ Corresponding author. Priority Research Centre for Brain and Mental Health Studies, PO Box 833, Newcastle, NSW, 2300, Australia. Tel.: +61 4924 6648; fax: +61 4924 6608. E-mail address: Linda.e.campbell@newcastle.edu.au (L. Campbell).
Article info
Article history:
Received 21 January 2009
Received in revised form 11 June 2009
Accepted 11 June 2009
Keywords:
Emotion face processing
Visual scanpaths
Eye-gaze avoidance
Social functioning
Abstract

Previous research demonstrates that people with 22q11.2 deletion syndrome (22q11DS) have social and interpersonal skill deficits. However, the basis of these deficits is unknown. This study examined, for the first time, how people with 22q11DS process emotional face stimuli, using visual scanpath technology. The visual scanpaths of 17 adolescents with 22q11DS and age- and gender-matched healthy controls were recorded while they viewed face images depicting one of seven basic emotions (happy, sad, surprised, angry, fearful, disgusted and neutral). Recognition accuracy was measured concurrently. People with 22q11DS differed significantly from controls, displaying visual scanpath patterns characterised by fewer fixations and a shorter scanpath length. The 22q11DS group also spent significantly more time gazing at the mouth region and significantly less time looking at the eye regions of the faces. Recognition accuracy was correspondingly impaired, with 22q11DS subjects displaying particular deficits for fear and disgust. These findings suggest that 22q11DS is associated with a maladaptive visual information processing strategy that may underlie the affect recognition and social functioning deficits observed in this group.
© 2009 Elsevier Ireland Ltd. All rights reserved.
1. Introduction
One of the most common microdeletion disorders is the 22q11.2
deletion syndrome (22q11DS), which affects 1 in every 4000 live births (Oskarsdottir et al., 2004). It is estimated that 20–50% of children with
22q11DS have an autism spectrum disorder (Fine et al., 2005; Vorstman
et al., 2006) and that 30% will go on to develop a severe psychiatric
disorder (e.g., schizophrenia) in adulthood (Murphy et al., 1999). Like
autism and schizophrenia, 22q11DS is associated with significant social
dysfunction. The early literature reports a broad range of impairments in
social skills and behaviours including withdrawal, shyness, interaction
problems and limited facial expressions (Golding-Kushner et al., 1985;
Gerdes et al., 1999; Swillen et al., 1999a,b; Niklasson et al., 2001). These
deficits in social ability are not only detrimental to the establishment of
positive social relationships in people affected by 22q11DS, but
ultimately may lead to social isolation in this group. Although social
skill deficits have been routinely reported in people with 22q11DS, there
is a lack of empirical studies examining the cognitive and perceptual
mechanisms that may underlie these difficulties. Several studies have
used visual scanpath technology to examine face processing deficits in other clinical groups, including autism spectrum disorders (ASDs; Jemel et al., 2006) and schizophrenia (Loughland et al., 2002). The ability to
perceive and accurately process facial information is important for
positive interpersonal and social communication. This study is the first
to undertake this research in 22q11.2 deletion syndrome.
Faces are complex, highly salient and biologically meaningful visual
stimuli that provide a wealth of information about other people, including their gender, age and identity. They allow us to make inferences about the emotions and intentions of others, and their eye gaze can help orientate our attention to objects or events in our
immediate environment. Unlike complex non-face stimuli that are
recognised in terms of their isolated component features, faces are
recognised holistically based on the configural relationships that exist
between features (Young et al., 1987; Tanaka and Farah, 2003). Face
perception is therefore a relatively automatic process. Emotion perception requires the activation of a specific network of different brain
regions including the amygdala, hippocampus, fusiform gyrus, thalamus
and medial and inferior frontal cortex. For example, fearful faces are
associated with increased amygdala activation (LeDoux, 1995; Morris
et al., 1996; Adolphs, 2002; Vuilleumier and Pourtois, 2007). Damage to
one or more of these brain regions can result in an inability to accurately
process facial information (Phillips et al., 2003).
One way to examine the strategies people employ while viewing
face stimuli is to record their visual scanpaths. Visual scanpaths are a
schematic map of fixations (moments when the fovea is directed toward and held on a specific feature of interest), linked by saccades (the rapid eye movements between fixations).
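To make these measures concrete, the sketch below shows how scanpath metrics of the kind reported here (fixation count, scanpath length, and dwell time in eye and mouth regions) are typically derived from eye-tracker output. This is an illustrative example only, not the analysis code used in this study: the Fixation record, the pixel coordinates of the regions of interest, and all function names are hypothetical.

    from dataclasses import dataclass
    from math import hypot

    @dataclass
    class Fixation:
        x: float            # horizontal screen position (pixels)
        y: float            # vertical screen position (pixels)
        duration_ms: float  # fixation duration (milliseconds)

    # Hypothetical regions of interest for a centred face image,
    # given as (x_min, y_min, x_max, y_max) bounding boxes in pixels.
    ROIS = {
        "eyes":  (120, 140, 392, 210),
        "mouth": (180, 330, 332, 400),
    }

    def in_roi(f, box):
        x0, y0, x1, y1 = box
        return x0 <= f.x <= x1 and y0 <= f.y <= y1

    def scanpath_metrics(fixations):
        """Fixation count, scanpath length, and percentage dwell time per ROI."""
        metrics = {"n_fixations": len(fixations)}
        # Scanpath length: summed Euclidean distance between successive fixations.
        metrics["path_length_px"] = sum(
            hypot(b.x - a.x, b.y - a.y) for a, b in zip(fixations, fixations[1:])
        )
        total_ms = sum(f.duration_ms for f in fixations) or 1.0  # guard against /0
        for name, box in ROIS.items():
            dwell = sum(f.duration_ms for f in fixations if in_roi(f, box))
            metrics[f"dwell_{name}_pct"] = 100.0 * dwell / total_ms
        return metrics

    # Example: three fixations moving from the eye region down to the mouth.
    demo = [Fixation(250, 180, 300), Fixation(260, 200, 250), Fixation(255, 360, 400)]
    print(scanpath_metrics(demo))

Under these assumptions, the dwell-time percentages correspond to the "time gazing at the mouth/eye regions" measures reported in the abstract, and the summed inter-fixation distance corresponds to the "scanpath length".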