This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.

This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/ACCESS.2020.3012900, IEEE Access.

Date of publication xxxx 00, 0000, date of current version xxxx 00, 0000.

BioCNN: A Hardware Inference Engine for EEG-based Emotion Detection

HECTOR A. GONZALEZ 1 (Student Member, IEEE), SHAHZAD MUZAFFAR 2 (Student Member, IEEE), JERALD YOO 3,4 (Senior Member, IEEE), AND IBRAHIM (ABE) M. ELFADEL 2,5 (Senior Member, IEEE)

1 Chair of Highly-Parallel VLSI-Systems and Neuro-Microelectronics, Technische Universität Dresden, Germany (e-mail: hector.gonzalez@tu-dresden.de)
2 Department of Electrical Engineering and Computer Science, Khalifa University, Abu Dhabi, United Arab Emirates (e-mail: {shahzad.muzaffar, ibrahim.elfadel}@ku.ac.ae)
3 Department of Electrical and Computer Engineering, National University of Singapore, Singapore (e-mail: jyoo@nus.edu.sg)
4 Singapore Institute for Neurotechnology (SINAPSE), Singapore
5 Center for Cyber Physical Systems, Khalifa University, Abu Dhabi, United Arab Emirates

Corresponding author: Ibrahim (Abe) M. Elfadel (e-mail: ibrahim.elfadel@ku.ac.ae).

The experimental procedures involving human subjects described in this paper have been approved by the Research Ethics Committee at Khalifa University under Protocol #H17-003.

ABSTRACT EEG-based emotion classifiers have the potential to significantly improve the social integration of patients suffering from neurological disorders such as Amyotrophic Lateral Sclerosis or the acute stages of Alzheimer's disease. Emotion classifiers have historically been implemented in software running on general-purpose computers and operating under off-line conditions.
Yet wearability is a must if such classifiers are to enable the socialization of critical-care patients. Wearability, in turn, requires low-power hardware accelerators that enable near real-time classification and extended periods of operation. In this paper, we architect, design, implement, and test a handcrafted hardware Convolutional Neural Network, named BioCNN, optimized for EEG-based emotion detection and other biomedical applications. The EEG signals are generated using a low-cost, off-the-shelf device, namely the Emotiv Epoc+, and are denoised and pre-processed ahead of their use by BioCNN. For training and testing, BioCNN uses three emotion-classification datasets: the publicly available DEAP and DREAMER datasets, along with an original dataset collected in-house from 5 healthy subjects using standard visual stimuli. A subject-specific training approach is used under TensorFlow to train BioCNN, which is implemented on the Digilent Atlys board with a low-cost Spartan-6 FPGA. The experimental results show a competitive energy efficiency of 11 GOps/W, a throughput of 1.65 GOps that is in line with the real-time specification of a wearable device, and a latency of less than 1 ms, well below the 150 ms required for natural human interaction. Its emotion inference accuracy is competitive with that of the top software-based emotion detectors.

INDEX TERMS Emotion Recognition; EEG; FPGA; Machine Learning; Hardware Accelerator; Edge AI; Convolutional Neural Networks; Hardware Parallelism; Pipelining.

I. INTRODUCTION

PATIENTS suffering from Amyotrophic Lateral Sclerosis (ALS) or the late stages of Alzheimer's disease are in a locked-in emotional state that prevents them from using their facial features to express emotions. On the other hand, the brain EEG signals of such patients are not impacted by this state and continue to contain the information needed to detect emotional content.
The main goal of this paper is to prove the feasibility of a wearable, small-footprint, EEG-based machine-learning device that helps these patients communicate their emotions in real time to their social environment, particularly their families and care providers. Such a device is built around a low-power, FPGA-implemented emotion classifier with competitive classification accuracies for both the valence and arousal of the classified emotion.

Multiple efforts have been made to design EEG-based software classifiers for emotions, ranging from shallow models [1] to deep ones, the latter including hybrid combinations of convolutional neural networks (CNNs) for extracting EEG features and recurrent neural networks (RNNs) for analysing