ARTICLE IN PRESS

Facial expressions and complex IAPS pictures: Common and differential networks

Jennifer C. Britton,a,* Stephan F. Taylor,b Keith D. Sudheimer,a and Israel Liberzonb,c

a Department of Neuroscience, University of Michigan, Ann Arbor, MI 48109, USA
b Department of Psychiatry, University of Michigan, Ann Arbor, MI 48109, USA
c Psychiatry Service, Ann Arbor VAMC, Ann Arbor, MI 48105, USA

Received 20 July 2005; revised 11 December 2005; accepted 16 December 2005

Neuroimaging studies investigating emotion have commonly used two different visual stimulus formats: facial expressions of emotion and emotionally evocative scenes. However, it remains an important unanswered question whether these different stimulus formats engage the same processes. Facial expressions of emotion may elicit more emotion recognition/perception, whereas evocative pictures may elicit more direct experience of emotion. In spite of these differences, common areas of activation have been reported across different studies, but little work has examined activations in response to the two stimulus formats in the same subjects. In this fMRI study, we compared BOLD activation patterns elicited by facial expressions of emotion and by complex emotional pictures from the International Affective Picture System (IAPS) to determine whether these stimuli activate similar or distinct brain regions. Healthy volunteers passively viewed blocks of expressive faces and IAPS pictures balanced for specific emotion (happy, sad, anger, fear, neutral), interleaved with blocks of fixation. Eye movements, reaction times, and off-line subjective ratings of discrete emotion, valence, and arousal were also recorded. Both faces and IAPS pictures activated similar structures, including the amygdala, posterior hippocampus, ventromedial prefrontal cortex, and visual cortex.
In addition, expressive faces uniquely activated the superior temporal gyrus, insula, and anterior cingulate more than IAPS pictures, despite the faces being less arousing. For the most part, these regions were activated in response to all specific emotions; however, some regions responded only to a subset.
© 2006 Elsevier Inc. All rights reserved.

Introduction

Emotion research utilizes different types of stimuli (e.g., expressive faces and complex evocative pictures) to probe affective processing; however, the two lines of investigation have remained relatively separate. Facial expressions are often viewed as external signals of experienced emotions that communicate information to the observer (Frank and Stennett, 2001). Facial expressions portraying specific emotions (e.g., happy, sad, anger, fear) are universally recognized (Ekman, 1992, 1994; Izard, 1994), and each expression of discrete emotion has meaning, targeting a specific response (Halberstadt and Niedenthal, 1997). Even though facial expressions are used frequently as probes of emotion recognition, some studies have shown that faces can also be inducers of emotion (Hatfield et al., 1992; Wild et al., 2001). Facial expressions have also been shown to evoke physiological changes (Clark et al., 1992; Esteves and Ohman, 1993), and autonomic activity in response to facial expressions has been shown to correlate with neural activation (Williams et al., 2004). Complex pictures from the International Affective Picture System (IAPS), another common emotional probe, depict emotion-laden scenes to induce affective states. The standardized set of IAPS pictures has been rated in terms of valence (unpleasant/pleasant) and arousal (calm/excited). These measures have also been correlated with viewers' heart rate and skin conductance changes, respectively, providing physiological validity to subjectively reported emotion induction (Lang et al., 1993).
However, little work has been done to identify the discrete emotions elicited by these pictures. Although both emotional faces and IAPS pictures target emotional processing, these two stimulus sets may preferentially engage certain brain structures involved in emotion. In addition, it is not known whether facial expressions and IAPS pictures of specific emotions (happy, sad, anger, and fear) would activate similar or distinct circuits. Studies of expressive faces and IAPS pictures suggest that a similar set of regions is involved in processing both emotional stimulus types. Expressive faces and IAPS pictures activate regions involved in emotion processing, including the amygdala (Breiter et al., 1996; Liberzon et al., 2003; Morris et al., 1996), hippocampus (Gur et al., 2002; Lane et al., 1997c), insula (Phan et al., 2004; Phillips et al., 1997), anterior cingulate (ACC; Killgore and Yurgelun-Todd, 2004; Morris et al., 1998), medial prefrontal cortex (mPFC; Kim et al., 2003; Taylor et al., 2003; Winston et al., 2003), ventromedial prefrontal cortex (vMPFC; Phan et al., 2004)/orbitofrontal cortex (OFC; Blair et al.,

1053-8119/$ - see front matter © 2006 Elsevier Inc. All rights reserved.
doi:10.1016/j.neuroimage.2005.12.050
* Corresponding author. Massachusetts General Hospital, Psychiatry Department, Building 149 Thirteenth Street, Charlestown, MA 02129, USA. Fax: +1 617 726 4078. E-mail address: jbritton@nmr.mgh.harvard.edu (J.C. Britton).
NeuroImage xx (2006) xxx–xxx