AUTOMATIC EMOTION INDUCTION AND ASSESSMENT FRAMEWORK
Enhancing User Interfaces by Interpreting Users' Multimodal Biosignals

Jorge Teixeira, Vasco Vinhas, Luís Paulo Reis and Eugénio Oliveira

FEUP - Faculdade de Engenharia da Universidade do Porto, Rua Dr. Roberto Frias s/n, Porto, Portugal
DEI - Departamento de Engenharia Informática, Rua Dr. Roberto Frias s/n, Porto, Portugal
LIACC - Laboratório de Inteligência Artificial e Ciência de Computadores, Rua do Campo Alegre 823, Porto, Portugal
jtf@fe.up.pt, vvm@fe.up.pt, lpreis@fe.up.pt, eco@fe.up.pt

Keywords: Biosignals, Emotions, Classification, Multimedia, Clustering.

Abstract: The definition, identification, systematic induction, and efficient, reliable classification of emotions are themes in which several complementary knowledge areas, such as psychology, medicine and computer science, have been investing seriously. This project consists of developing an automatic tool for emotion assessment based on a dynamic biometric data acquisition set, of which galvanic skin response and electroencephalography are practical examples. The output of standard emotional induction methods supports a classification based on data analysis and processing. The conducted experimental sessions, together with the developed support tools, allowed conclusions to be drawn, such as the capability of effectively performing automatic classification of the subject's predominant emotional state. Self-assessment interviews validated the developed tool's success rate of approximately 75%. The experiments also strongly suggested that female subjects are emotionally more active and more easily induced than males.

1 INTRODUCTION

Emotions play an important role in all human activities, from the trivial to the most complex. This significance manifests itself both in the perception of reality and in the cognitive decision process.
Meanwhile, computers have gained such a relevant presence in modern society that they have been introduced into almost every aspect of it, amplifying the reach of ubiquitous computing. With these two realities in mind – the importance of emotional states and the necessity of daily interaction with multiple devices – merging them would be a great improvement. By providing distributed computer systems with the perception of their users' emotions, applications would be able to adjust their interfaces and promote and suggest functionalities accordingly. It is believed that this approach would increase the overall system's transparency and efficiency, as its dynamic behaviour would follow the user's intentions and mood.

Alongside ubiquitous computing, multimedia contents are becoming ever more complex and closer to reality, enabling a greater sensation of immersion in the action. Nevertheless, the primitive, absolute need to achieve a perfect match between audiovisual contents and the audience's desires is still present and constitutes the main key to the industry's success. The alliance between the possibility of choosing multimedia contents, which enables audience members to individually watch what they desire, and accurate emotional state detection systems leads to subconscious individual interaction between the audience and the multimedia control system, fostering the perfect match between content and individual audience desires.

This study illustrates a proposal for an application that enables automatic emotional state assessment using minimally invasive solutions. The rest of the paper is organized as follows: in the next section the current state of the art is presented; in Section 3 a project description is given; in Section 4 the study's results are depicted; consequently, the project's conclusions are listed and future work areas are identified in the final section.