People with autism have communication difficulties that may result, in part, from difficulties in processing the nonverbal cues that accompany speech. Nonverbal facial and body gestures provide crucial information for understanding language and for early language development. Dr. Bennetto's lab has found that typically developing children benefit from this added nonverbal information, but children with autism do not. In fact, language comprehension is slowed for children with autism when gestures co-occur with speech.

This study tests the hypothesis that children with autism do not automatically process nonverbal cues. Additionally, their difficulty in integrating verbal and nonverbal information may reflect differences in how individuals with autism integrate auditory and visual information. Dr. Bennetto and her pre-doctoral fellow will use functional magnetic resonance imaging (fMRI) to examine how the brains of children with high-functioning autism process gesture (communicative hand and arm movements) and speech. They will also determine whether the difficulties in integrating visual and auditory cues in autism are limited to speech signals or extend to non-social situations as well.

What this means for people with autism: Finding out how and where the brain integrates visual and auditory information related to speech is essential for understanding the communication deficits in people with autism. The results of this study will also inform the development of effective language interventions.