Individuals with autism spectrum disorders (ASDs) often have difficulty perceiving emotional cues such as facial expression, tone of voice, or body posture, which contributes to impairments in social skills. Previous research has shown that while typically developing individuals automatically mimic facial expressions when viewing pictures of emotional faces, individuals with ASD tend not to respond in this way. The present study will examine automatic facial mimicry in 20 individuals with ASD in response to both visual and auditory emotional cues. Rather than still photographs, more realistic audio-video cues will be used, and facial movement and expression in response to these cues will be measured with sensors attached to the face. Researchers will determine whether deficits in automatic facial mimicry correlate with emotion perception and social engagement. This research will clarify whether facial mimicry is involved in emotion perception in ASD and may contribute to our understanding of deficits in social skills in this population.