Computational Modeling of Visual Attention in Young Children with Autism Spectrum Disorders
Recent research using eye-tracking technology suggests that people with autism spectrum disorder (ASD) focus their gaze on the mouth and body rather than on the eyes of faces in a visual scene. These scanning patterns have been found to correlate with measures of social dysfunction. Characterizing and quantifying differences in how children with autism attend to different stimuli in a social scene can therefore illuminate underlying (and harder to measure) deficits in social-cognitive functioning. Eye tracking thus holds promise for improved diagnostic and measurement techniques.

To better describe patterns of eye gaze and visual processing in children with autism, the contextual cues in a social scene will be altered and gaze patterns measured. This includes changing the orientation of faces in a movie or turning the sound on and off. Gaze patterns will be recorded as participants watch these videos to assess how they perceive and understand the scene under normal and perturbed conditions. The ultimate goal is to build a developmental model of visual preferences in individuals with and without autism, to better understand how these preferences contribute to social dysfunction.

What this means for people with autism: This is a unique opportunity for an engineer and student of computational modeling to bring methods and expertise to the Yale Child Study Center. The findings from this study will help pinpoint the underlying causes of specific differences in visual behavior in people with autism and how those differences may lead to social deficits. In addition, this work could lead to new automated diagnostic tools for ASD based on computational analyses of gaze.
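As a rough illustration of the kind of quantification described above, one common approach in eye-tracking analysis is to compute the share of total fixation time that falls on each area of interest (AOI), such as the eyes, mouth, or body. The sketch below is a minimal, hypothetical example of that computation; the function name, data format, and AOI labels are illustrative assumptions, not the study's actual analysis pipeline.

```python
# Minimal sketch: quantifying gaze allocation across areas of interest
# (AOIs) in a social scene. Each fixation is represented as an
# (aoi_label, duration_ms) pair; names here are illustrative only.
from collections import defaultdict

def aoi_proportions(fixations):
    """Return each AOI's share of total fixation time."""
    totals = defaultdict(float)
    for aoi, duration_ms in fixations:
        totals[aoi] += duration_ms
    grand_total = sum(totals.values())
    return {aoi: t / grand_total for aoi, t in totals.items()}

# Example: a viewer who spends more time on the mouth than the eyes,
# the kind of pattern the research above associates with ASD.
fixations = [("eyes", 200.0), ("mouth", 500.0), ("body", 300.0)]
print(aoi_proportions(fixations))
# → {'eyes': 0.2, 'mouth': 0.5, 'body': 0.3}
```

Group-level comparisons would then contrast these proportions between viewers with and without ASD, and across the normal and perturbed viewing conditions.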