RESEARCH ARTICLE

The effects of visual material and temporal synchrony on the processing of letters and speech sounds

Maria Mittag · Rika Takegata · Teija Kujala

Received: 14 July 2010 / Accepted: 6 April 2011 / Published online: 23 April 2011
© Springer-Verlag 2011

Abstract Associating letters with speech sounds is essential for reading skill acquisition. In the current study, we aimed at determining the effects of different types of visual material and temporal synchrony on the integration of letters and speech sounds. To this end, we recorded the mismatch negativity (MMN), an index of automatic change detection in the brain, from literate adults. Subjects were presented with auditory consonant–vowel syllable stimuli together with visual stimuli, which were either written syllables or scrambled pictures of the written syllables. The visual stimuli were presented in half of the blocks synchronously with the auditory stimuli and in the other half 200 ms before the auditory stimuli. The auditory stimuli were consonant, vowel or vowel length changes, or changes in syllable frequency or intensity, presented by using the multi-feature paradigm. Changes in the auditory stimuli elicited MMNs in all conditions. MMN amplitudes for the consonant and frequency changes were generally larger for the sounds presented with written syllables than with scrambled syllables. Time delay diminished the MMN amplitude for all deviants. The results suggest that speech sound processing is modulated when the sounds are presented with letters versus non-linguistic visual stimuli, and further, that the integration of letters and speech sounds seems to be dependent on precise temporal alignment. Moreover, the results indicate that with our paradigm, a variety of parameters relevant and irrelevant for reading can be tested within one experiment.
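As a minimal illustration of the sequence structure implied by the multi-feature paradigm mentioned in the abstract, the sketch below generates a stimulus list in which standard syllables alternate with deviants, each deviant differing from the standard in one of the five features studied here (consonant, vowel, vowel length, frequency, intensity). The function name, the pseudorandomization rule (no immediate repetition of a deviant type), and all parameters are illustrative assumptions, not the authors' actual stimulus-delivery code.

```python
import random

# The five deviant dimensions named in the abstract.
DEVIANT_TYPES = ["consonant", "vowel", "vowel_length", "frequency", "intensity"]

def multi_feature_sequence(n_pairs, seed=0):
    """Build an illustrative multi-feature stimulus sequence (hypothetical
    helper): standards alternate with deviants, and the deviant type is
    pseudorandomized so the same type never occurs twice in a row."""
    rng = random.Random(seed)
    sequence = []
    previous = None
    for _ in range(n_pairs):
        sequence.append("standard")
        # Choose any deviant type except the one used immediately before.
        deviant = rng.choice([d for d in DEVIANT_TYPES if d != previous])
        sequence.append(deviant)
        previous = deviant
    return sequence

seq = multi_feature_sequence(10)
```

The alternating structure is what lets several auditory features be probed within a single recording block, as the abstract notes.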
Keywords Audiovisual processing · Mismatch negativity (MMN) · Event-related potentials (ERPs) · Letter-speech sound processing · Multisensory integration

Introduction

Associating letters with speech sounds is essential for developing literacy skills. Research findings across languages have shown a causal connection between phonological awareness and the reading development of a child (Snowling 1981; Goswami 2002). A child learning to read faces the problem of mapping the alphabetic code onto speech sounds. Failure to form this mapping between alphabetic codes and sounds is considered a main cause of developmental dyslexia (Siegel and Faux 1989; Snowling 1981). Hence, the investigation of how letters are integrated with speech sounds, and of what factors might influence this binding process, is of great importance.

Visual material has an impact on auditory perception, as demonstrated by the so-called "McGurk effect", in which seeing lip movements while hearing certain syllables often causes an illusory speech percept (McGurk and MacDonald 1976). Written syllables (Massaro et al. 1988) and words (Frost et al. 1988) were also found to influence auditory speech perception. For instance, it was shown that the identification of spoken syllables was influenced by lip reading and, to some extent, by written syllables (Massaro et al. 1988). Frost and others (1988) found that a spoken word masked by noise is detected faster and more accurately when the word is presented with its matching printed version than with a non-matching print or a neutral visual stimulus.

In the current study, we aim at examining the integration of written and heard syllables in literate adults by using the

M. Mittag (corresponding author) · R. Takegata · T. Kujala, Cognitive Brain Research Unit, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland. e-mail: maria.mittag@helsinki.fi

Exp Brain Res (2011) 211:287–298. DOI 10.1007/s00221-011-2686-z