Human Activity Recognition on Time Series Accelerometer Sensor Data using LSTM Recurrent Neural Networks

Chrisogonas O. Odhiambo, Computer Science & Engineering, University of South Carolina, Columbia, SC, USA, odhiambo@email.sc.edu
Sanjoy Saha, Pennington Biomedical Research Center, Louisiana State University, Baton Rouge, LA, USA, sanjoy.saha@pbrc.edu
Corby K. Martin, Pennington Biomedical Research Center, Louisiana State University, Baton Rouge, LA, USA, Corby.Martin@pbrc.edu
Homayoun Valafar, Computer Science & Engineering, University of South Carolina, Columbia, SC, USA, homayoun@cse.sc.edu

Abstract

The use of sensors available through smart devices has pervaded everyday life in several applications, including human activity monitoring, healthcare, and social networks. In this study, we focus on the use of smartwatch accelerometer sensors to recognize eating activity. More specifically, we collected sensor data from 10 participants while they consumed pizza. Using this information, together with comparable data available for similar events such as smoking and medication-taking and for the dissimilar activity of jogging, we developed an LSTM-ANN architecture that demonstrated 90% success in identifying individual bites and distinguishing them from puffing, medication-taking, and jogging activities.

Keywords: Smartwatch, Accelerometer, Sensors, Artificial Intelligence, Machine Learning, LSTM, Human Activity Recognition, Eating, Bite, Food Intake

I. INTRODUCTION

Accurately assessing health behaviors in humans is necessary to evaluate health risk and effectively intervene to facilitate behavior change, improve health, and reduce disease risk. Health behaviors, such as eating, smoking, exercise (e.g., jogging), and medication-taking, are frequently assessed with subjective self-report methods, such as diaries, which participants complete throughout the day. The accuracy of self-report methods is poor, however, particularly for assessing food intake and physical activity [1].
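The recognition task summarized in the abstract, feeding windows of tri-axial accelerometer samples through an LSTM and classifying the final hidden state into one of the four activities, can be sketched in plain NumPy. This is an illustrative toy with random, untrained weights; the hidden size, window length, and four-class softmax head are assumptions chosen for exposition, not the authors' actual implementation:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step on a single accelerometer sample x = (ax, ay, az)."""
    z = W @ x + U @ h + b                 # stacked gate pre-activations, shape (4H,)
    H = h.shape[0]
    i = 1.0 / (1.0 + np.exp(-z[:H]))      # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2*H]))   # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2*H:3*H])) # output gate
    g = np.tanh(z[3*H:])                  # candidate cell state
    c = f * c + i * g                     # update cell memory
    h = o * np.tanh(c)                    # new hidden state
    return h, c

def classify_window(window, params):
    """Run an LSTM over a (T, 3) accelerometer window; softmax over activities."""
    W, U, b, Wout = params
    H = U.shape[1]
    h, c = np.zeros(H), np.zeros(H)
    for x in window:                      # unroll over the time dimension
        h, c = lstm_step(x, h, c, W, U, b)
    logits = Wout @ h                     # classify the final hidden state
    e = np.exp(logits - logits.max())     # numerically stable softmax
    return e / e.sum()                    # probs over {bite, puff, pill, jog}

# Hypothetical shapes: hidden size 8, 3 input axes, 4 activity classes.
rng = np.random.default_rng(0)
H, D, C = 8, 3, 4
params = (rng.normal(size=(4 * H, D)) * 0.1,   # input weights W
          rng.normal(size=(4 * H, H)) * 0.1,   # recurrent weights U
          np.zeros(4 * H),                     # gate biases b
          rng.normal(size=(C, H)) * 0.1)       # output head Wout
probs = classify_window(rng.normal(size=(50, 3)), params)  # one 50-sample window
print(probs.shape)
```

In a trained system the weights would be fit by backpropagation through time on labeled windows; the sketch only shows the forward pass that maps a sensor window to class probabilities.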
Self-report methods are also burdensome for participants, particularly if health behaviors need to be assessed over the long term [2]. Mobile health technology, including sensors worn on the body, can be used to passively and remotely collect and transmit objective data. These objective data can be much more valid and reliable than self-report, particularly for exercises such as walking [3]. The passive collection and transmission of data to researchers or clinicians have other advantages, including a dramatic reduction in participant burden and the ability to process and provide feedback to participants automatically in real-time or near real-time. This critical step provides a platform to develop and deliver ecological momentary interventions (EMI) [4] and just-in-time adaptive interventions (JITAI) [5]. EMI and JITAI deliver intervention strategies that are customized to address the specific needs of individual participants as soon as those needs are detected. Participant needs are identified by evaluating the objective data from the remote sensors in real-time or near real-time. Indeed, EMI and JITAI can provide more automated and cost-effective approaches to intervening and improving health behavior remotely while maintaining efficacy [6]–[8].

Here we report a novel application of Artificial Neural Networks to objectively and automatically identify and discriminate eating activity from three other activities, namely smoking, medication-taking, and jogging, using accelerometer data acquired from a smartwatch. Validation of the algorithm would make it possible to develop and deploy novel EMI and JITAI to improve these four health behaviors. Machine Learning algorithms have been used to develop practical solutions in multiple domains, including health diagnosis [9]–[11], sports [12], [13], and human activity recognition [14]–[16], among many others.

II. BACKGROUND AND METHOD

A. Previous and Related Work

Considering their rich array of sensors, low cost, accessibility, and ease of use, smartwatches have emerged as a compelling platform for studying human activities unobtrusively. Smartwatches have been used for step counting [17], sleep monitoring [18], and diet monitoring [19], as well as general fitness tracking [20]. In the context of smoking, smartwatches have been shown to be usable for the in-situ study of smoking [21], [22] with high accuracy [21]–[24]. Smartwatches have been used to detect smoking gestures with 95% accuracy in a laboratory environment [25] and 90% accuracy in-situ [23]. Smartwatch-based detection of smoking has also been demonstrated to be more accurate than self-report (90% versus 78%) [23], [26]. Wearable devices have been utilized for the indirect observation of activities with clear health implications, such as medication adherence, in numerous ways, including (1) self-reporting the behavior via mobile devices [27], and (2) sensors worn around the neck, e.g., the SenseCam [28], which was originally envisaged for use within the domain of Human Digital Memory to create a personal lifelog or visual recording of the wearer's life, which can