Electrophysiological (EEG, sEEG, MEG) evidence for multiple audiovisual interactions in the human auditory cortex

Julien Besle a,b,c,*, Olivier Bertrand a,b, Marie-Hélène Giard a,b

a INSERM, U821, Brain Dynamics and Cognition, 69500 Lyon, France
b Université Lyon 1, 69000 Lyon, France
c Université Lyon 2, 69000 Lyon, France

Article history: Received 31 March 2009; Received in revised form 24 June 2009; Accepted 25 June 2009; Available online 30 June 2009

Keywords: Auditory; Visual; Multisensory; Intracranial; Speech; Human ERPs

Abstract

In this review, we examine the contribution of human electrophysiological studies (EEG, sEEG and MEG) to the study of visual influence on processing in the auditory cortex. Focusing mainly on studies performed by our group, we critically review the evidence showing (1) that visual information can both activate and modulate the activity of the auditory cortex at relatively early stages (mainly at the processing stage of the auditory N1 wave) in response to both speech and non-speech sounds, and (2) that visual information can be included in the representation of both speech and non-speech sounds in auditory sensory memory. We describe an important conceptual tool in the study of audiovisual interactions (the additive model) and show the importance of considering the spatial distribution of electrophysiological data when interpreting EEG results. Review of these studies points to the probable role of sensory, attentional and task-related factors in modulating audiovisual interactions in the auditory cortex.

© 2009 Elsevier B.V. All rights reserved.

1. Introduction

In this review, we will examine the contribution of human electrophysiological studies to the study of audiovisual interactions in the auditory cortex.
Electrophysiological recordings, using electro-encephalography (EEG) or magneto-encephalography (MEG) in normal human participants and stereotaxic EEG (sEEG) in epileptic patients, provide temporally precise information, on the order of a millisecond, about the global activity of one or several populations of cortical cells. Because of this excellent temporal resolution, and because of the high temporal definition of the cellular processes assumed to be measured (post-synaptic membrane potential changes), these studies have commonly focused on the timing of crossmodal interactions, with an emphasis on early integration processes, asking how early visual information can influence processing in the auditory cortex. Another reason for focusing on early stages of the integration process is that the criterion used to detect multisensory interactions in most EEG/MEG studies is probably only valid within the first 200 ms of stimulus processing (Besle et al., 2004a).

Importantly, whole-head EEG and MEG studies can also give some insight into the spatial localization of interaction effects, which is essential for localizing them in the auditory cortex: activation of parts of the human auditory cortex (mostly those in the supratemporal plane) results in a very characteristic distribution of evoked potentials, with a polarity reversal between electrodes on the mastoids and fronto-central sites (or a field pattern orthogonal to the supratemporal plane in MEG). Whenever such a topography is observed, the underlying generator is likely to be auditory. In the first part of this review, we will see how this temporal and spatial information was used to describe integration processes in the human auditory cortex during the perception of audiovisual objects and audiovisual speech.
In the second part, we will show how the mismatch negativity (MMN), an electrophysiological index of auditory sensory memory, can be used to study the inclusion of visual information in the representation of an audiovisual event in the auditory cortex.

2. Application of the additive model to the study of audiovisual interactions in EEG/MEG/sEEG

2.1. The additive model

Most electrophysiological studies of multisensory interactions in humans have used a common approach (for reviews, see Besle et al., 2004a; Calvert, 2001) involving recording the EEG in

doi:10.1016/j.heares.2009.06.016
* Corresponding author. Address: INSERM, U821, Brain Dynamics and Cognition, Université Lyon 1, 69500 Lyon, France. E-mail address: julien.besle@inserm.fr (J. Besle).
Hearing Research 258 (2009) 143–151
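The logic of the additive model named above can be illustrated with a minimal sketch. Under the model, a multisensory interaction is inferred whenever the evoked response to the audiovisual stimulus (AV) differs from the sum of the responses to the auditory-only (A) and visual-only (V) stimuli. The function and array shapes below are illustrative assumptions, not the authors' analysis code; statistical assessment of the difference is omitted.

```python
import numpy as np

def additive_model_interaction(erp_av, erp_a, erp_v):
    """Estimate the crossmodal interaction term AV - (A + V).

    Each input is a trial-averaged evoked response of shape
    (n_channels, n_samples). Under the additive model, a difference
    from zero at some channel and latency indicates a multisensory
    interaction (to be tested statistically in practice).
    """
    return erp_av - (erp_a + erp_v)

# Hypothetical synthetic data: 2 channels x 100 time samples.
rng = np.random.default_rng(0)
erp_a = rng.standard_normal((2, 100))
erp_v = rng.standard_normal((2, 100))

# If the AV response is exactly the sum of the unisensory responses,
# the interaction term is zero everywhere (no integration detected).
erp_av = erp_a + erp_v
interaction = additive_model_interaction(erp_av, erp_a, erp_v)
print(np.allclose(interaction, 0))  # True
```

In practice the comparison is restricted to early latencies (here, the first ~200 ms of the epoch), since, as noted above, later time windows may violate the model's assumptions.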