SPECIAL SECTION ON NEW TRENDS IN BRAIN SIGNAL PROCESSING AND ANALYSIS

Received November 1, 2018, accepted November 17, 2018, date of publication November 23, 2018, date of current version January 4, 2019.
Digital Object Identifier 10.1109/ACCESS.2018.2883213

Using Deep Convolutional Neural Network for Emotion Detection on a Physiological Signals Dataset (AMIGOS)

LUZ SANTAMARIA-GRANADOS 1, MARIO MUNOZ-ORGANERO 2, (Member, IEEE), GUSTAVO RAMIREZ-GONZÁLEZ 3, ENAS ABDULHAY 4, AND N. ARUNKUMAR 5

1 Faculty of Systems Engineering, Universidad Santo Tomás, Tunja 110311, Colombia
2 Telematics Engineering Department, UC3M-BS Institute of Financial Big Data, Universidad Carlos III de Madrid, 28911 Leganes, Spain
3 Telematics Department, University of Cauca, Popayán, Colombia
4 Department of Biomedical Engineering, Faculty of Engineering, Jordan University of Science and Technology, Irbid 22110, Jordan
5 Department of Electronics and Instrumentation, SASTRA University, Thanjavur 613401, India

Corresponding author: Gustavo Ramirez-González (gramirez@unicauca.edu.co)

This work was supported in part by the Government of Colombia, in part by Colciencias, and in part by the Governorate of Boyacá.

ABSTRACT Recommender systems have traditionally been based on context and content; the technological challenge now arises of making personalized recommendations based on the user's emotional state, inferred from physiological signals obtained from devices or sensors. This paper applies a deep learning approach, using a deep convolutional neural network, to a dataset of physiological signals (electrocardiogram and galvanic skin response), in this case the AMIGOS dataset. Emotions are detected by correlating these physiological signals with the arousal and valence annotations of the dataset in order to classify the affective state of a person.
In addition, an emotion recognition application based on classic machine learning algorithms is proposed, which extracts features of the physiological signals in the time, frequency, and non-linear domains. The proposed deep approach instead uses a convolutional neural network for automatic feature extraction from the physiological signals, and emotion prediction is made through fully connected network layers. The experimental results on the AMIGOS dataset show that the method proposed in this paper achieves better precision in the classification of emotional states than that originally obtained by the authors of the dataset.

INDEX TERMS Emotion recognition, deep convolutional neural network, physiological signals, machine learning, AMIGOS dataset.

I. INTRODUCTION

During the last two decades, MIT's affective computing research group has aroused great interest in scientific and academic communities that seek to improve the human emotional experience with technology [1]. Some challenges focus on deepening machine learning and deep learning algorithms, to ensure that emotion recognition systems achieve high precision and robustness in the processing of physiological data [2]. Computational models of emotion [3] have been applied to the recognition of affective states through physiological measures, such as Heart Rate Variability (HRV), Blood Volume Pulse (BVP), Skin Temperature (SKT) [4], Electrocardiogram (ECG), and Electrodermal Activity (EDA) [5], which come from the peripheral and central nervous systems.

Affective states are subjective experiences classified along the valence and arousal focuses [6]. Both focuses reflect the degree to which a person incorporates emotions into their conscious affective experience [7]. The valence focus is associated with the pleasurable or unpleasant aspects of a stimulus, whereas the arousal focus captures the activation or deactivation of an emotion.
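As an illustrative sketch (not part of the paper), the valence and arousal dimensions described above can be discretized into the four quadrants of the circumplex model. The 1-9 rating scale with a midpoint of 5 assumed here follows the self-assessment convention used in datasets such as AMIGOS; the function name and quadrant labels are hypothetical:

```python
def quadrant(valence: float, arousal: float, midpoint: float = 5.0) -> str:
    """Map a self-reported (valence, arousal) rating to one of four
    affective quadrants: HV/LV = high/low valence, HA/LA = high/low arousal.

    Assumes ratings on a 1-9 scale; 5.0 is taken as the neutral midpoint.
    """
    if valence >= midpoint:
        return "HVHA" if arousal >= midpoint else "HVLA"
    return "LVHA" if arousal >= midpoint else "LVLA"
```

For example, a rating of valence 7 and arousal 8 would fall in the high-valence, high-arousal quadrant (e.g., joy or excitement), while valence 3 and arousal 2 would fall in the low-valence, low-arousal quadrant (e.g., sadness or boredom).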
2169-3536 © 2018 IEEE. Translations and content mining are permitted for academic research only. Personal use is also permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information. VOLUME 7, 2019, p. 57.

Some databases correlate affective states with physiological signals [8]–[10], which result from the emotions self-reported by people. The emotional categories are established in a circular struc-