Human Behavior Lab


This cutting-edge project is in progress, aiming to open in September 2020. The Human Behavior Laboratory can help answer questions about human thoughts and emotions, offering insights into health, education, and marketing. The lab can quickly and accurately collect biometric data, including eye tracking, facial expression analysis (reading emotions), neural signals (electroencephalography, EEG), galvanic skin response (GSR), and heart and respiration rates. The result is a greater understanding of how context and emotions influence human behavior. The lab will also include a one-way mirror behind which observers can sit while an examiner works with the participant on the other side.

Eye-tracking studies have shown that typically developing humans have an innate tendency to attend to the central facial region of other people, regardless of scene complexity, salience, or the presence of other, non-social objects; more recently, eye tracking has been applied in the field of autism. Moreover, social processing is also observable through pupillometry, which provides direct information about the level of sympathetic arousal.

To participate in the research on Behavioral Analysis Using Virtual Reality, fill out the form via:

To participate in the research on Diagnosis Using Eye Tracking, fill out the form via:


We have developed a battery of tests based on cutting-edge theory that will help objectively identify autism with good specificity and sensitivity. Notably, most studies attempting to evaluate eye-tracking indices as diagnostic markers of autism have relied on a single eye-tracking metric, which limits both sensitivity and specificity and fails to capture the multidimensional nature of the disorder, something no single variable can reflect.

We believe that combining several eye-tracking and pupillometric indices based on theoretically motivated paradigms will increase the accuracy of autism identification. To ensure feasibility and compliance, the whole battery will take no longer than 8-10 minutes. It will be developed chiefly with young children in mind (toddlers and preschoolers), but will also be usable with school-aged children, adolescents, and young adults. Norms will need to be established for the different age groups. For the purposes of diagnosis, clinicians will observe and note the following: visual disengagement, joint attention, autonomic reaction to noise, gaze latency to social information, gaze during face perception, and eye-gaze avoidance.
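To illustrate the idea of combining several indices rather than relying on a single metric, here is a minimal sketch of how normalized eye-tracking and pupillometric measures could be merged into one composite score. This is not the lab's actual model; the metric names, thresholds, and weights below are hypothetical placeholders for illustration only.

```python
# Hypothetical sketch: combining multiple eye-tracking indices into one
# composite score. All names, weights, and normalization thresholds are
# illustrative assumptions, not the lab's actual battery.

from dataclasses import dataclass

@dataclass
class EyeTrackingProfile:
    disengagement_latency_ms: float  # visual disengagement latency
    joint_attention_hits: int        # successful joint-attention shifts (of 10 trials)
    face_fixation_ratio: float       # share of gaze time on faces (0-1)
    pupil_response_z: float          # pupillometric arousal response (z-score)

def composite_score(p: EyeTrackingProfile, weights=None) -> float:
    """Combine normalized indices; a higher score indicates a more
    atypical (autism-like) profile on this illustrative scale."""
    w = weights or {"latency": 0.3, "joint": 0.3, "face": 0.25, "pupil": 0.15}
    # Normalize each index to roughly 0-1 (cutoffs are placeholders).
    latency = min(p.disengagement_latency_ms / 2000.0, 1.0)  # slower disengagement
    joint = 1.0 - min(p.joint_attention_hits / 10.0, 1.0)    # fewer attention shifts
    face = 1.0 - p.face_fixation_ratio                       # less face-looking
    pupil = min(max(p.pupil_response_z, 0.0) / 3.0, 1.0)     # atypical arousal
    return (w["latency"] * latency + w["joint"] * joint
            + w["face"] * face + w["pupil"] * pupil)
```

In practice, weights would be fitted to normative data for each age group rather than chosen by hand, but the design point stands: a multidimensional score can capture variance that no single index does.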


A secondary aim of this project is to provide interventions for individuals diagnosed with ASD. We will examine and validate the outcomes of applying virtual reality (VR) to deliver behavioral intervention services. Children with ASD will be exposed to VR environments that match their daily-life activities, within which interventions will be applied. Children will have opportunities to practice life skills, such as crossing the street, and to learn the steps to do so successfully in a therapeutic environment. Furthermore, the study aims to validate eye-tracking measurements in VR and to examine differences in fixation durations and counts on specific items presented in VR.
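Fixation durations and counts are typically derived from raw gaze samples by a fixation-detection step. As a hedged illustration of how this could work, the sketch below implements a standard dispersion-threshold (I-DT) approach; the thresholds are illustrative placeholders, not the values used in this study.

```python
# Minimal dispersion-threshold (I-DT) fixation detection sketch, showing
# how fixation counts and durations could be derived from raw gaze
# samples. Thresholds are illustrative, not the lab's actual parameters.

def detect_fixations(samples, max_dispersion=1.0, min_duration_ms=100):
    """samples: list of (timestamp_ms, x, y) gaze points, time-ordered.
    Returns a list of (start_ms, end_ms) fixation intervals."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i
        # Grow the window while the gaze points stay within the
        # dispersion threshold (x-range + y-range).
        while j + 1 < n:
            xs = [s[1] for s in samples[i:j + 2]]
            ys = [s[2] for s in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        duration = samples[j][0] - samples[i][0]
        if duration >= min_duration_ms and j > i:
            fixations.append((samples[i][0], samples[j][0]))
            i = j + 1  # continue after this fixation
        else:
            i += 1     # too short: advance one sample and retry
    return fixations
```

The fixation count on an item is then the number of detected intervals whose gaze points fall on that item, and the total fixation duration is the sum of their lengths.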

We hypothesize that a behavioral intervention presented to children with autism spectrum disorder will lead to improvements in their ability to safely participate in activities of daily life. Moreover, we hypothesize that training in VR will lead to changes in how visual attention is distributed to important items within the VR environments.

Contact Us