Neuropsychologia, Vol. 30, No. 4, pp. 329–340, 1992. Printed in Great Britain. 0028-3932/92 $5.00+0.00 © 1992 Pergamon Press Ltd

ACQUISITION OF SIGNS FROM AMERICAN SIGN LANGUAGE IN HEARING INDIVIDUALS FOLLOWING LEFT HEMISPHERE DAMAGE AND APHASIA

STEVEN W. ANDERSON,* HANNA DAMASIO,* ANTONIO R. DAMASIO,*† EDWARD KLIMA,†‡ URSULA BELLUGI† and JOAN P. BRANDT*

*Division of Behavioral Neurology and Cognitive Neuroscience, Department of Neurology, The University of Iowa College of Medicine, Iowa City, Iowa; †Laboratory for Language and Cognitive Studies, The Salk Institute for Biological Studies, La Jolla, California; and ‡Department of Linguistics, University of California, San Diego, La Jolla, California, U.S.A.

(Received 28 August 1991; accepted 6 November 1991)

Abstract—Three severely aphasic hearing patients with no prior knowledge of sign language were able to acquire competency in aspects of the American Sign Language (ASL) lexicon and fingerspelling, in contrast to a near-complete inability to speak the English counterparts of these visuo-gestural signs. Two patients with damage in left postero-lateral temporal and inferior parietal cortices mastered production and comprehension of single signs and short meaningful sign sequences, but the one patient with damage to virtually all left temporal cortices was less accurate in single sign processing and was unable to produce sequences of signs at all. These findings suggest that conceptual knowledge is represented independently of the auditory-vocal records for the corresponding lexical entries, and that left anterior temporal cortices outside of traditional "language areas" are part of the neural network which supports the linkage between conceptual knowledge and linguistic signs, especially as they are used in the sequenced activations required for production or comprehension of meaningful sentences.
UNTIL RECENTLY, virtually all information regarding the brain basis of linguistic abilities has come from the study of spoken languages. There is little doubt that aspects of any given language are shaped by the sensory and motor apparatus involved in the expression of that language, and many questions remain regarding the role of the auditory-vocal channel in the linguistic structures, abilities, and impairments investigated after brain damage. However, by studying the processing of a language based on visual signs following brain damage, we hoped to obtain an additional perspective on the neural basis of language. We focused on American Sign Language (ASL), an autonomous language with organizational principles similar to those of English and other auditory-vocal languages, but with a mode of transmission and linguistic mechanisms that have evolved within the framework of a visual-gestural symbolic system.

Two of the primary levels of linguistic organization in ASL are the combining of sublexical elements into meaningful units and spatially organized syntax. The former refers to the composition of individual, meaningful signs from combinations of a limited set of recurring sublexical components. The latter refers to the specification of semantic and syntactic relations among signs by manipulating signs in space. The present study focuses primarily on the first level of ASL organization. The combining of sublexical units is analogous to the combination of phonemes into meaningful units in spoken languages. In ASL, individual signs which serve as referents for