Research projects

How do infants individuate human faces? 

This project aims to understand how young infants discern and recognize different types of faces based on traits such as race, gender, personal identity, and age. Through an interactive reaching game created for this project, infants can form expectations about how many individual faces are hidden inside a box. We are interested in whether infants focus more on some of these traits than on others, and whether there are age-related changes between 1 and 2 years of age. Our previous work shows that 1- and 2-year-old infants make distinctions between human and non-human objects. Interestingly, we found an age-related change in how infants distinguish female- and male-looking faces. Specifically, 2-year-olds are more likely to expect that two faces are involved in the game when they see a female face followed by a male face, whereas 1-year-olds did not make any clear distinction between these types of faces. For the full paper, visit our publications page.

Study status

We are currently recruiting participants! We are looking for infants who are either 11 to 13 months or 23 to 25 months of age. Infants participate in an interactive reaching game while sitting on their parent's or caregiver's lap.

Click here for more information and to learn how to participate in this study. 

How does the brain respond to facial expressions? 

This project examines how 6-month-old infants and college students perceive different faces. The faces vary in race, gender, and emotional expression. We are particularly interested in how attention differs toward fearful compared with neutral facial expressions. Participants' eye movements are tracked with a Tobii eye tracker as they view images of faces, while their electrical brain activity is recorded through an electroencephalography (EEG) cap.

Study status

Data collection for this study is complete. We are currently processing and analyzing the eye-tracking and EEG data.