Identification of emotional intonation evaluated by fMRI

D. Wildgruber,a,b,* A. Riecker,a,b I. Hertrich,a M. Erb,b W. Grodd,b T. Ethofer,a and H. Ackermanna

a Department of General Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Tübingen, Germany
b Section MR of CNS, Department of Neuroradiology, University of Tübingen, Tübingen, Germany

Received 19 July 2004; revised 15 October 2004; accepted 28 October 2004
Available online 15 December 2004

During acoustic communication among human beings, emotional information can be expressed both by the propositional content of verbal utterances and by the modulation of speech melody (affective prosody). It is well established that linguistic processing is bound predominantly to the left hemisphere of the brain. By contrast, the encoding of emotional intonation has been assumed to depend specifically upon right-sided cerebral structures. However, prior clinical and functional imaging studies yielded discrepant data with respect to interhemispheric lateralization and intrahemispheric localization of brain regions contributing to processing of affective prosody. In order to delineate the cerebral network engaged in the perception of emotional tone, functional magnetic resonance imaging (fMRI) was performed during recognition of prosodic expressions of five different basic emotions (happy, sad, angry, fearful, and disgusted) and during phonetic monitoring of the same stimuli. As compared to baseline at rest, both tasks yielded widespread bilateral hemodynamic responses within frontal, temporal, and parietal areas, the thalamus, and the cerebellum. A comparison of the respective activation maps, however, revealed comprehension of affective prosody to be bound to a distinct right-hemisphere pattern of activation, encompassing the posterior superior temporal sulcus (Brodmann Area [BA] 22) as well as dorsolateral (BA 44/45) and orbitobasal (BA 47) frontal areas.
Activation within left-sided speech areas, in contrast, was observed during the phonetic task. These findings indicate that partially distinct cerebral networks subserve the processing of phonetic and intonational information during speech perception.
© 2004 Elsevier Inc. All rights reserved.

Keywords: Affect; Communication; Emotion; fMRI; Language; Lateralization; Prosody; Phonetics; Valence

Introduction

During speech production, information about a speaker's emotional state is predominantly conveyed by the modulation of intonation (affective prosody). At the perceptual level, emotional tone is characterized by variations of pitch, syllable duration, loudness, and voice quality across utterances (suprasegmental features) imposed upon the segmental verbal information encoded in phonetic/phonological units (Ackermann et al., 1993; Bachorowski and Owren, 2003; Banse and Scherer, 1996; Cutler et al., 1997; Sidtis and Van-Lancker-Sidtis, 2003). Concerning the cerebral topography of prosody processing, observations in patients suffering from focal brain lesions indicate that the well-established left-sided dominance for language comprehension does not extend to the perception of emotional intonation (Adolphs, 2002; Baum and Pell, 1999; Borod et al., 2001, 2002; Charbonneau et al., 2003; Pell and Baum, 1997). According to the neuroanatomical model proposed by Ross (1981), prosodic information is processed within distinct right-sided perisylvian regions that are organized in complete analogy to the left-sided language areas. Expression of affective prosody, thus, is believed to rely on the homologue of Broca's area within the right inferior frontal cortex, whereas comprehension of intonational information is presumed to be bound to the homologue of Wernicke's area within the right superior temporal region. However, the empirical evidence for this model provided by Ross (1981) was based on a few case reports only, and more systematic investigations yielded divergent results.
Nevertheless, as concerns the comprehension of speech melody, the findings of the majority of lesion studies are compatible with the assumption that perceptual prosodic functions are predominantly bound to the right posterior perisylvian cortex (Borod et al., 2002; Darby, 1993; Heilman et al., 1984; Starkstein et al., 1994). In addition, various clinical examinations indicate that a widespread network of partially bilateral cerebral regions, including the frontoparietal cortex (Adolphs et al., 2002; Breitenstein et al., 1998) and the basal ganglia (Breitenstein et al., 1998, 2001; Cancelliere and Kertesz, 1990; Pell and Leonard, 2003), contributes to the comprehension of emotional intonation. In line with these findings, neuroimaging studies as a rule yielded rightward lateralization of hemodynamic

1053-8119/$ - see front matter © 2004 Elsevier Inc. All rights reserved.
doi:10.1016/j.neuroimage.2004.10.034

* Corresponding author. Department of General Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Hoppe-Seyler-Str. 3, 72076 Tübingen, Germany. Fax: +49 7071 294371.
E-mail address: dirk.wildgruber@med.uni-tuebingen.de (D. Wildgruber).
Available online on ScienceDirect (www.sciencedirect.com).

NeuroImage 24 (2005) 1233-1241
www.elsevier.com/locate/ynimg