Sign Language and the Brain

Two prominent issues concerning brain organization in deaf users of SIGN LANGUAGES are whether deaf individuals show complementary HEMISPHERIC SPECIALIZATION for language and nonlanguage visuo-spatial skills, and whether classical language areas within the left hemisphere participate in sign-language processing. These questions are especially pertinent given that signed languages of the deaf make significant use of visuo-spatial mechanisms to convey linguistic information. Thus sign languages exhibit properties for which each of the cerebral hemispheres shows specialization: visuo-spatial processing and language processing. Three sources of evidence have commonly been used to investigate brain organization in signers: behavioral studies using tachistoscopic visual half-field paradigms, studies of signers who have incurred focal brain damage, and, more recently, neural imaging studies of normal signing volunteers. Many of these studies have investigated American Sign Language (ASL), which is but one of the many naturally occurring signed languages of the world. These studies have provided insight into the determinants of hemispheric specialization and the contributions of environmental and biological factors to the establishment of neural systems mediating human language.

The behavioral literature on cerebral lateralization for sign language is based largely upon tachistoscopic visual half-field studies. As a whole, these studies yield inconsistent and contradictory findings, ranging from reports of right-hemisphere dominance to left-hemisphere dominance to no hemispheric asymmetry for sign-language processing in the deaf. Methodological factors contribute to this wide range of findings: variability in inclusion criteria for deaf subjects (e.g., etiology and degree of hearing loss), variability in language background and schooling (e.g., native signers, nonnative signers, oral schooling, sign-based schooling), and differences in stimulus characteristics (e.g., manual-alphabet handshapes, static drawings of ASL signs, and moving signs). Discussion here is limited to studies using profoundly deaf, native signing adults as subjects. Poizner, Battison, and Lane (1979) examined the contribution of movement in sign-language stimuli. They reported a left visual field (LVF) advantage for static signs and no hemispheric asymmetry for moving signs. In a study manipulating depth of processing, Grossi et al. (1996) reported no hemispheric asymmetry for judgments of signs based on physical characteristics. However, a significant right visual field (RVF) advantage emerged when subjects were asked to judge whether handshapes stood in morphological relationships to one another. Emmorey and Corina (1993) tested hemispheric specialization using a lexical decision paradigm with moving signs that varied in imageability. Deaf signers and hearing English speakers both showed a left-hemisphere advantage for abstract lexical items. English speakers exhibited no visual field effect for imageable words; deaf signers, however, showed a significant right-hemisphere advantage for imageable signs.
These studies suggest that hemispheric asymmetries are more likely to be elicited from moving sign stimuli as well as when deaf subjects are engaged in higher level lexical processing of sign stimuli. Under these constraints, patterns of left-hemisphere dominance for language may emerge.

In 1878, Hughlings Jackson wrote, "No doubt by disease of some part of his brain the deaf-mute might lose his natural system of signs" (p. 304). Since then, roughly 25 individual case studies of signers with brain injury have been reported (see Corina 1998 for a recent review). However, many of the early case studies were compromised by a lack of understanding of the relationships among the systems of communication used by deaf individuals. For example, several of the early studies focused on disruptions of fingerspelling and only briefly mentioned or assessed sign-language use. More recently, well-documented case studies have begun to provide a clearer picture of the neural systems involved in sign-language processing. From these later studies, it becomes evident that right-handed deaf signers, like hearing persons, exhibit APHASIA when critical left-hemisphere areas are damaged (Poizner, Klima, and Bellugi 1987). Approximately one dozen case studies provide sufficient detail to implicate left-hemisphere structures in sign-language disturbances. A subset of these cases provides neuroradiological or autopsy reports confirming left-hemisphere involvement, along with compelling language assessments implicating aphasic language disturbance. In addition, five cases of signers with right-hemisphere pathology have been reported. All five of these signers showed moderate to severe degrees of nonlinguistic visuo-spatial impairment accompanied by relatively intact sign-language skills. In contrast, none of the left-hemisphere-damaged signers tested on nonlinguistic visuo-spatial tests showed significant impairment. Taken together, these findings suggest that deaf signers show complementary specialization for language and nonlanguage skills. These studies demonstrate that the development of hemispheric specialization is not dependent upon exposure to oral/aural language.

Disruptions in sign-language ability following left-hemisphere damage are similar to the patterns found in hearing users of spoken languages. For example, execution of speech movements involves the cortical zone encompassing the lower posterior portion of the left frontal lobe (Goodglass 1993). Left-hemisphere posterior frontal regions are also implicated in sign-language production. A representative case is that of Poizner, Klima, and Bellugi's (1987) subject G. D., who had damage to Brodmann's areas 44 and 45 of the left frontal lobe. This subject's signing was effortful and dysfluent, reduced largely to single-sign utterances, but her sign-language comprehension remained unimpaired. In hearing individuals, severe language comprehension deficits are associated with left-hemisphere posterior temporal lesions. Similar patterns have been observed in users of signed languages. For example, W. L., who suffered damage to the posterior temporal area, was globally aphasic for ASL (Corina et al. 1992). W. L. evidenced marked comprehension deficits and showed a gradation of impairment, with some difficulty in single-sign recognition, moderate impairment in following commands, and severe problems with complex ideational material. W. L.'s sign production remained moderately fluent, but he made numerous sign-language "phonemic" paraphasias. Phonemic paraphasias arise from substitutions or omissions of sublexical phonological components. In ASL, sublexical structure refers to the formational elements that comprise a sign form: handshape, location, movement, and orientation. A common form of sign paraphasic error involves the incorrect use of handshape for a given sign. Importantly, W. L. also showed intact production and comprehension of nonlinguistic pantomime. Thus, although profoundly aphasic for linguistic properties of ASL, W. L. was motorically facile and capable of producing and understanding nonlinguistic pantomime.
Taken together, these findings provide evidence that language impairments following stroke in deaf signers follow the characteristic pattern of left frontal damage leading to nonfluent output with spared comprehension, whereas left posterior lesions yield fluent output with impaired language comprehension. Dissociation between nonlinguistic pantomime skills and language use further demonstrates that these impairments are aphasic in nature and do not reflect general problems in symbolic conceptualization or motor behavior.

Functional imaging techniques have been used to examine sign-language representation in the brain. Comparisons of sentence processing for written English and ASL reveal both commonalities and differences across hearing nonsigners and native users of sign language. A functional MAGNETIC RESONANCE IMAGING (fMRI) study by Neville et al. (1998) shows that when hearing or deaf subjects process their native languages (ASL or English), classical anterior and posterior language areas within the left hemisphere are recruited. This finding is consistent with data from studies of spoken- and sign-language aphasia. These results suggest that the early acquisition of a fully grammatical, natural language is important in the specialization of these systems. However, unlike the patterns observed for English processing, when deaf and hearing native signers process sentences in ASL, robust activation is also observed in right-hemisphere prefrontal regions and in posterior and anterior parts of the superior temporal sulcus. These findings imply that the specific nature and structure of ASL results in the recruitment of the right hemisphere into the language system. Recent electrophysiological studies of neurologically intact native signers also indicate that both the left and right hemispheres are active during ASL sentence processing (Neville et al. 1997). These results suggest that activation within the right hemisphere may be specifically linked to the linguistic use of space. The degree of right-hemisphere activation observed in these studies is surprising given the lack of significant aphasic symptomology reported in right-hemisphere-damaged signers.

Taken together, studies of the neural basis of sign-language processing highlight strong biases: left inferior frontal and posterior temporoparietal regions are well suited to process a natural language independent of the form of that language. At the same time, these studies reveal that the specific structure and processing requirements of a language also determine, in part, the final form of the language systems of the brain.


-- David P. Corina

References

Corina, D. P. (1998). The processing of sign language: Evidence from aphasia. In H. Whitaker and B. Stemmer, Eds., Handbook of Neurolinguistics. San Diego, CA: Academic Press.

Corina, D. P., H. P. Poizner, T. Feinberg, D. Dowd, and L. O'Grady. (1992). Dissociation between linguistic and nonlinguistic gestural systems: A case for compositionality. Brain and Language 43:414-447.

Emmorey, K., and D. Corina. (1993). Hemispheric specialization for ASL signs and English words: Differences between imageable and abstract forms. Neuropsychologia 31(7):645-653.

Goodglass, H. (1993). Understanding Aphasia. San Diego, CA: Academic Press.

Grossi, G., C. Semenza, S. Corazza, and V. Volterra. (1996). Hemispheric specialization for sign language. Neuropsychologia 34(7):737-740.

Jackson, J. H. (1878). On affections of speech from disease of the brain. Brain 1:64. Reprinted in J. Taylor, Ed., Selected Writings of Hughlings Jackson, vol. 2. London, 1932.

Neville, H. J., D. Bavelier, D. P. Corina, J. P. Rauschecker, A. Karni, A. Lalwani, A. Braun, V. P. Clark, P. Jezzard, and R. Turner. (1998). Cerebral organization for language in deaf and hearing subjects: Biological constraints and effects of experience. Proceedings of the National Academy of Sciences 95:922-929.

Neville, H. J., S. A. Coffey, D. S. Lawson, A. Fischer, K. Emmorey, and U. Bellugi. (1997). Neural systems mediating American Sign Language: Effects of sensory experience and age of acquisition. Brain and Language 57(3):285-308.

Poizner, H., R. Battison, and H. Lane. (1979). Cerebral asymmetry for American Sign Language: The effects of moving stimuli. Brain and Language 7(3):351-362.

Poizner, H., E. S. Klima, and U. Bellugi. (1987) What the Hands Reveal about the Brain. Cambridge, MA: MIT Press.

Further Readings

Corina, D. P., M. Kritchevsky, and U. Bellugi. (1996). Visual language processing and unilateral neglect: Evidence from American Sign Language. Cognitive Neuropsychology 13(3):321-351.

Corina, D. P. (1998). Aphasia in users of signed languages. In P. Coppens, Y. Lebrun, and A. Basso, Eds., Aphasia in Atypical Populations. Hillsdale, NJ: Erlbaum.

Corina, D. P., J. Vaid, and U. Bellugi. (1992). Linguistic basis of left hemisphere specialization. Science 255:1258-1260.

Emmorey, K. (1996). The confluence of space and language in signed languages. In P. Bloom, M. Peterson, L. Nadel, and M. Garrett, Eds., Language and Space. Cambridge, MA: MIT Press, pp. 171-209.

Hickok, G., U. Bellugi, and E. S. Klima. (1996). The neurobiology of sign language and its implications for the neural basis of language. Nature 381(6584):699-702.

Kimura, D. (1981). Neural mechanisms in manual signing. Sign Language Studies 33:291-312.

Kegl, J., and H. Poizner. (1991). The interplay between linguistic and spatial processing in a right-lesioned signer. Journal of Clinical and Experimental Neuropsychology 13:38-39.

Neville, H. J. (1990). Intermodal competition and compensation in development: Evidence from studies of the visual system in congenitally deaf adults. Annals of the New York Academy of Sciences 608:71-87.

Poizner, H., and J. Kegl. (1992). Neural basis of language and motor behavior: Perspectives from American Sign Language. Aphasiology 6(3):219-256.

Soderfeldt, B., J. Ronnberg, and J. Risberg. (1994). Regional cerebral blood flow in sign language users. Brain and Language 46:59-68.