PS_2.099 - Spatial and temporal dynamics of lexico-semantic processing in American Sign Language

Leonard, M., Ferjan Ramirez, N., Torres, C., Hatrak, M., Mayberry, R., & Halgren, E.

University of California, San Diego

It has been demonstrated that written and auditory words evoke lexico-semantic neural activity in a mostly left-hemisphere fronto-temporal network between ~200 and 500 ms, suggesting that the brain areas that process meaning are modality-independent. Is the same true when one’s first language is acquired in a visuo-motor modality, as is the case for congenitally deaf individuals who learn sign language? Using a multimodal imaging approach that combines the temporal resolution of magnetoencephalography (MEG) with the spatial resolution of MRI, we examined a group of deaf native signers of American Sign Language (ASL). We presented signs that were either matched or mismatched with a picture of an object, and localized the activity occurring ~400 ms after the onset of the sign videos to a left fronto-temporal network similar to that observed for speech. Our results agree with previous research using hemodynamic and lesion methods, but add a crucial temporal component, demonstrating that the shared neural substrate for sign and speech is driven by similar temporal dynamics.