RESEARCH ARTICLE

Evaluation of electrocardiogram: numerical vs. image data for emotion recognition system [version 1; peer review: awaiting peer review]

Sharifah Noor Masidayu Sayed Ismail 1, Nor Azlina Ab. Aziz 2, Siti Zainab Ibrahim 1, Sophan Wahyudi Nawawi 3, Salem Alelyani 4,5, Mohamed Mohana 4, Lee Chia Chun 6

1 Faculty of Information Science & Technology, Multimedia University, Bukit Beruang, Melaka, 75450, Malaysia
2 Faculty of Engineering, Multimedia University, Bukit Beruang, Melaka, 75450, Malaysia
3 School of Electrical Engineering, Faculty of Engineering, Universiti Teknologi Malaysia, Skudai, Johor Bahru, 81310, Malaysia
4 Center for Artificial Intelligence, King Khalid University, Abha, 61421, Saudi Arabia
5 College of Computer Science, King Khalid University, Abha, 61421, Saudi Arabia
6 Hexon Data Sdn Bhd, Kuala Lumpur, 59200, Malaysia

First published: 04 Nov 2021, 10:1114 (v1), https://doi.org/10.12688/f1000research.73255.1
Latest published: 04 Nov 2021, 10:1114, https://doi.org/10.12688/f1000research.73255.1

Abstract

Background: The electrocardiogram (ECG) is a physiological signal used to diagnose and monitor cardiovascular disease, usually in the form of ECG wave images. Numerous studies have shown that ECG can be used to detect human emotions from numerical data; however, ECG is typically captured as a wave image rather than as numerical data. There is still no consensus on the effect of the ECG input format (either an image or a numerical value) on the accuracy of the emotion recognition system (ERS), and ERS using ECG images remains inadequately studied. Therefore, this study compared ERS performance using ECG image and ECG numerical data to determine the effect of the ECG input format on the ERS.

Methods: This study employed the DREAMER dataset, which contains 23 ECG recordings obtained during audio-visual emotional elicitation. Numerical data was converted to ECG images for the comparison.
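The numerical-to-image conversion step can be sketched as follows: a 1-D ECG segment is plotted as a waveform and rasterized into a grayscale array. This is an illustrative sketch only; the sampling rate, figure size, and rendering settings are assumptions, not details taken from the paper.

```python
# Minimal sketch: render a numerical ECG segment as a waveform image.
# fs and size_px are illustrative assumptions.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # off-screen rendering
import matplotlib.pyplot as plt

def ecg_to_image(samples, fs=256, size_px=(256, 256)):
    """Plot a 1-D ECG segment and return it as a grayscale uint8 array."""
    t = np.arange(len(samples)) / fs
    dpi = 100
    fig, ax = plt.subplots(figsize=(size_px[0] / dpi, size_px[1] / dpi), dpi=dpi)
    ax.plot(t, samples, color="black", linewidth=0.8)
    ax.axis("off")  # keep only the waveform, no axes
    fig.canvas.draw()
    rgba = np.asarray(fig.canvas.buffer_rgba())
    plt.close(fig)
    return rgba[..., :3].mean(axis=-1).astype(np.uint8)

# Synthetic stand-in for a real ECG recording
sig = np.sin(2 * np.pi * 1.2 * np.arange(512) / 256)
img = ecg_to_image(sig)
print(img.shape)  # (256, 256)
```

In practice each recording would be split into fixed-length windows and every window rendered to one image before image-feature extraction.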
Numerous approaches were used to obtain ECG features. The Augsburg BioSignal Toolbox (AUBT) and the Toolbox for Emotional feature extraction from Physiological signals (TEAP) extracted features from the numerical data. Meanwhile, features were extracted from the image data using Oriented FAST and Rotated BRIEF (ORB), Scale-Invariant Feature Transform (SIFT), KAZE, Accelerated-KAZE (AKAZE), Binary Robust Invariant Scalable Keypoints (BRISK), and Histogram of Oriented Gradients (HOG). Dimension reduction was performed using linear discriminant analysis (LDA), and valence and arousal were classified using a Support Vector Machine (SVM).

Results: The experimental results indicated that numerical data