Autistic children often fail to develop skills related to social signal processing, a deficit that may stem from difficulty attending to social cues conveyed through eye gaze. Like humans, macaque monkeys display distinct facial expressions when producing different vocal signals, and they can perceptually match the appropriate facial posture to a vocalization. The eye movement patterns that monkeys use to process these 'multisensory' social inputs are also identical to those used by human adults and children when they view human faces producing speech. This research uses macaque monkeys as a model system to shed light on the neurobiology of communication signals and the related deficits seen in autistic children. Significance: With this animal model, the researchers will be able to better define the role of functional interactions between cortical areas during multisensory integration of faces and voices.