Exploring Eye Activity as an Indication of Emotional States Using an Eye-tracking Sensor

Sharifa Alghowinem 1,4, Majdah AlShehri 2, Roland Goecke 3,1, and Michael Wagner 3,1

1 Australian National University, Canberra, Australia
2 King Saud University, Riyadh, Saudi Arabia
3 University of Canberra, Canberra, Australia
4 Ministry of Higher Education, Kingdom of Saudi Arabia

Abstract. The automatic detection of human emotional states has recently attracted great interest, with applications not only in the Human-Computer Interaction field, but also in psychological studies. Using an emotion elicitation paradigm, we investigate whether eye activity holds discriminative power for detecting affective states. Our paradigm includes emotions induced by watching emotional movie clips and spontaneous emotions elicited by interviewing participants about emotional events in their life. To reduce gender variability, the selected participants were 60 female native Arabic speakers (30 young adults and 30 mature adults). Overall, the automatic classification results using eye activity were reasonable, giving a 66% correct recognition rate on average. Statistical measures show statistically significant differences in eye activity patterns between positive and negative emotions. We conclude that eye activity, including eye movement, pupil dilation and pupil invisibility, could be used as a complementary cue for the automatic recognition of human emotional states.

Keywords: Affective computing, eye tracking, emotion recognition

1 Introduction

Affective computing – the study of the automatic recognition of human emotional states and their utilisation in a computer system – has attracted much interest lately due to its multidisciplinary applications.
For example, Human-Computer Interaction (HCI) is concerned with enhancing the interaction between users and computers by improving the computer’s understanding of the user’s needs, including the user’s emotional state [23]. In the education field, understanding the affective state of a student could lead to a more effective presentation style and improved learning [7]. A current interest is the personalisation of commercial products, which could be enhanced by understanding the client’s preferences based on their mood [31]. Moreover, such an understanding of the user’s emotions could enhance other applications such as virtual reality and smart surveillance [29]. Automatic recognition of emotions could also be useful in supporting psychological studies. For example, such studies could provide a baseline for the emotional reactions of healthy subjects, which could then be compared against and used to diagnose mental disorders such as autism [14] or depression [1].