This research project will build on current Autism Speaks funding by adding a graduate student who brings both clinical training from the Yale Child Study Center and imaging experience from the University of California, San Diego. Dr. Pelphrey and Ms. Carter will examine brain activity during joint attention tasks in children with autism. Joint attention and eye gaze processing are two hallmarks of what is termed "social perception," and people with autism demonstrate striking abnormalities in social perception. Researchers have only begun to map the neural systems, collectively termed the "social brain," that support aspects of social perception, and how the social brain functions in individuals with autism remains poorly understood. To better characterize the neural circuitry underlying eye gaze processing and joint attention, children with autism and typically developing children will view a novel virtual reality paradigm during functional MRI scanning to identify which brain regions participate in these tasks.

What this means for people with autism: This research will help identify the specific neurophysiological mechanisms involved in social perception, enabling improved evaluation of novel therapeutic interventions. In addition, this and other research on the development of social perception in autism will help shift treatment models, which traditionally are skill-based, toward teaching children with autism to monitor gaze, identify emotions, and infer the intentions of others.