The role of encoding and attention in facial emotion memory: An EEG investigation
Colleen A. Brenner a,⁎, Samuel P. Rumak a, Amy M.N. Burns a, Paul D. Kieffaber b

a University of British Columbia, Department of Psychology, 2136 West Mall, Vancouver, British Columbia V6T 1Z4, Canada
b College of William & Mary, Department of Psychology, P.O. Box 8795, Williamsburg, VA 23187-8795, USA

⁎ Corresponding author. Tel.: +1 604 822 4650. E-mail addresses: cbrenner@psych.ubc.ca (C.A. Brenner), s.rumak@psych.ubc.ca (S.P. Rumak), aburns@psych.ubc.ca (A.M.N. Burns), pdkieffaber@wm.edu (P.D. Kieffaber).
Article info
Article history:
Received 28 January 2014
Received in revised form 29 April 2014
Accepted 10 June 2014
Available online xxxx
Keywords: Event-related potential; N170; P100; N250; Theta; Emotion; Attention; Memory

Abstract
Facial expressions are encoded via sensory mechanisms, but extracting their meaning and salience involves cognitive functions. We investigated the time course of sensory encoding and subsequent maintenance in memory via EEG. Twenty-nine healthy participants completed a facial emotion delayed match-to-sample task. P100, N170 and N250 ERPs were measured in response to the first stimulus, and evoked theta power (4–7 Hz) was measured during the delay interval. Negative facial expressions produced larger N170 amplitudes and greater theta power early in the delay. N170 amplitude correlated with theta power; however, larger N170 amplitude coupled with greater theta power predicted behavioural performance for only one (very happy) of the six emotion conditions tested (see Supplemental Data). These findings indicate that the N170 ERP may be sensitive to emotional facial expressions when task demands require encoding and retention of this information. Furthermore, sustained theta activity may represent continued attentional processing that supports short-term memory, especially of negative facial stimuli. Further study is needed to investigate the potential influence of these measures, and their interaction, on behavioural performance.
Crown Copyright © 2014 Published by Elsevier B.V. All rights reserved.
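As context for the measures named above, the sketch below illustrates how evoked (phase-locked) theta power during a delay interval might be computed: trials are averaged first, the average is band-limited to 4–7 Hz, and instantaneous power is taken from the analytic signal. This is a minimal illustration; the sampling rate, filter order, and Hilbert-envelope approach are assumptions, not the study's reported pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def evoked_theta_power(epochs, fs=250):
    """Evoked 4-7 Hz power: average across trials *first* so that only
    phase-locked (evoked) activity survives, then band-pass filter and
    square the analytic amplitude.

    epochs : array, shape (n_trials, n_samples); delay-interval EEG
             segments time-locked to offset of the first stimulus
             (hypothetical data layout, for illustration only)
    """
    erp = epochs.mean(axis=0)                      # phase-locked average
    b, a = butter(4, [4.0, 7.0], btype="band", fs=fs)
    theta = filtfilt(b, a, erp)                    # isolate 4-7 Hz component
    return np.abs(hilbert(theta)) ** 2             # instantaneous theta power
```

Averaging before the power computation is what distinguishes evoked from induced power: activity whose phase varies from trial to trial cancels in the average and does not contribute.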
1. Introduction
Emotional facial expressions are an efficient way of communicating
one's emotional state. This information is extremely important in situa-
tions where socially appropriate responses require an accurate reading
of the emotional states of others. Emotional expressions are quickly
translated from sensory signals to higher order cognitive networks for
further processing and integration with broader executive processes,
including memory (Adolphs, 2003). Thus far, few studies have systematically investigated the influence of emotion on early sensory processing of faces, the maintenance of that information in short-term memory via sustained EEG activity, and whether the interplay between sensory- and maintenance-related activity affects performance.
Electrophysiological methods have the advantage of capturing the
brain's response to facial expressions on a millisecond timescale,
matching the timing of facial expression recognition in the course of a
natural interaction. Scalp-recorded event-related potentials (ERPs)
reflect the synchronized firing of large populations of neurons that are
time-locked to a stimulus. Previous research has identified that the
P100, the N170 and the N250 ERPs can be elicited by visual stimuli,
with the N170 and N250 particularly sensitive to facial stimuli (Bentin
et al., 1996; Herrmann et al., 2005a, 2005b; Streit et al., 2000). While
several other face-sensitive ERPs have been reported in the literature (EPN, N400, and LPC), we focus on the P100, N170 and N250 because these components are morphologically well characterized in the current study, are consistent with our recording parameters and choice of reference, and occur early enough to plausibly reflect sensory rather than cognitive processing.
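To make "time-locked" concrete: in signal terms, an ERP is obtained by cutting stimulus-aligned epochs out of the continuous recording, baseline-correcting them, and averaging. The following is a minimal Python sketch in which the sampling rate, epoch window, and data arrays are illustrative assumptions rather than the study's actual parameters.

```python
import numpy as np

FS = 250  # Hz; assumed sampling rate, for illustration

def compute_erp(eeg, onsets, tmin=-0.1, tmax=0.5, fs=FS):
    """Average stimulus-locked epochs into an ERP.

    eeg    : 1-D array, continuous single-channel EEG
    onsets : sample indices marking stimulus onsets
    """
    pre, post = int(-tmin * fs), int(tmax * fs)
    # cut one epoch per stimulus, from tmin to tmax around onset
    epochs = np.stack([eeg[s - pre : s + post] for s in onsets])
    # subtract each trial's mean pre-stimulus baseline
    epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)
    # averaging cancels activity not phase-locked to the stimulus
    return epochs.mean(axis=0)

# usage with synthetic data:
# eeg = np.random.randn(60 * FS)
# erp = compute_erp(eeg, onsets=[1000, 2000, 3000])
```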
1.1. Relevant ERPs: P100, N170 and N250
The P100 ERP is a positive deflection that peaks between 80 and
120 ms after a visual stimulus, and is thought to reflect attention-
based early visual processing (Mangun and Hillyard, 1991; Mangun,
The P100 is larger in response to expected rather than unexpected stimuli, and varies depending on stimulus properties and location
(Nakamura et al., 2001; Regan, 1989). Localization studies place the
generator of the P100 in bilateral occipital areas and fusiform gyrus
(Herrmann et al., 2005a; Taylor et al., 2011; T. K. W. Wong et al.,
2009). It is therefore considered an early index of attention-
modulated sensory processing. The data regarding face processing and
the P100 ERP are somewhat inconsistent, with some studies finding
P100 amplitude sensitive to faces compared to non-face stimuli, while
others fail to find such modulation (Herrmann et al., 2005b; Jacques
and Rossion, 2006; Liu et al., 2002; Utama et al., 2009a, 2009b; A. C.-N.
Wong et al., 2009). By presenting intact and scrambled faces versus objects, Rossion and Caharel (2011) demonstrated that the P100 reflects processing of low-level visual cues that are not related to the perception of a face per se.
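For instance, scoring the P100 from a trial-averaged waveform typically amounts to finding the most positive point within the 80–120 ms post-stimulus window. A hedged sketch follows, using the same assumed conventions as above; the window bounds and sampling rate are illustrative, not the study's scoring parameters.

```python
import numpy as np

def p100_amplitude(erp, fs=250, baseline=0.1, window=(0.080, 0.120)):
    """Score the P100 as the most positive sample 80-120 ms post-stimulus.

    erp      : trial-averaged waveform whose first int(baseline * fs)
               samples precede stimulus onset
    returns  : (amplitude, latency in ms relative to stimulus onset)
    """
    onset = int(baseline * fs)
    lo = onset + int(window[0] * fs)
    hi = onset + int(window[1] * fs)
    segment = erp[lo:hi]
    i = int(np.argmax(segment))          # index of the positive peak
    return segment[i], (window[0] + i / fs) * 1000.0
```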