Eye spy: The predictive value of fixation patterns in detecting subtle and extreme emotions from faces

Avinash R. Vaidya*, Chenshuo Jin, Lesley K. Fellows
Montreal Neurological Institute, Dept. of Neurology & Neurosurgery, McGill University, 3801 University St., Montreal, QC H3A 2B4, Canada

Cognition 133 (2014) 443–456
http://dx.doi.org/10.1016/j.cognition.2014.07.004
© 2014 Elsevier B.V. All rights reserved.

Article history: Received 26 November 2013; Revised 9 July 2014; Accepted 10 July 2014

Keywords: Emotion; Face perception; Expression; Modeling; Eye-tracking; Fixation

* Corresponding author. Present address: Rm 276, Montreal Neurological Institute, McGill University, 3801 University St., Montreal, QC H3A 2B4, Canada. Tel.: +1 514 398 2083. E-mail address: avinash.vaidya@mail.mcgill.ca (A.R. Vaidya).

Abstract

Successful social interaction requires recognizing subtle changes in the mental states of others. Deficits in emotion recognition are found in several neurological and psychiatric illnesses, and are often marked by disturbances in gaze patterns to faces, typically interpreted as a failure to fixate on emotionally informative facial features. However, there has been very little research on how fixations inform emotion recognition in healthy people. Here, we asked whether fixations predicted detection of subtle and extreme emotions in faces. We used a simple model to predict emotion detection scores from participants' fixation patterns. The best fit of this model heavily weighted fixations to the eyes in detecting subtle fear, disgust and surprise, with less weight, or zero weight, given to mouth and nose fixations. However, this model could not successfully predict detection of subtle happiness, or extreme emotional expressions, with the exception of fear. These findings argue that detection of most subtle emotions is best served by fixations to the eyes, with some contribution from nose and mouth fixations. In contrast, detection of extreme emotions and subtle happiness appeared to be less dependent on fixation patterns. The results offer a new perspective on some puzzling dissociations in the neuropsychological literature, and a novel analytic approach for the study of eye gaze in social or emotional settings.

1. Introduction

Day-to-day social situations require us to continuously interpret the emotional states of individuals with whom we interact. We use information from many sources in forming these interpretations, including body language, tone of voice and contextual factors (Barrett, Lindquist, & Gendron, 2007; Meeren, van Heijnsbergen, & de Gelder, 2005). The communication of emotional state through facial expressions has long been of particular interest, as stereotyped emotional expressions are well conserved across species and are thought to be universal among humans (Darwin, 1896; Ekman & Friesen, 1971). Recognizing these basic emotions requires searching for and detecting the emotional content in a face. Expressive information is largely conveyed through dynamic changes in facial features such as the width of the eyes, position of the jaw, or the curving of the lips (Calder, Burton, Miller, Young, & Akamatsu, 2001). The distinct pattern of features involved in each expression suggests that sampling of information-rich features might be an effective strategy for distinguishing between facial emotions. Smith, Cottrell, Gosselin, and Schyns (2005) confirmed that individual features observed in isolation are more or less useful in distinguishing between basic emotional expressions (e.g. eyes were more useful for fear, mouth for happiness), by requiring participants to judge an emotional expression where only parts of the face were visible (Bubbles method;