Realtime Online Daily Living Activity Recognition Using Head-Mounted Display

https://doi.org/10.3991/ijim.v11i3.6469

Fais Al Huda, Brawijaya University, Malang, Indonesia, fais.developer@gmail.com
Herman Tolle, Brawijaya University, Malang, Indonesia, emang@ub.ac.id
Rosa Andrie Asmara, State Polytechnic of Malang, Indonesia, rosa_andrie@polinema.ac.id

Abstract—Human activity recognition (HAR) is a popular research field whose results can be applied to many other domains, such as the military, commerce, and health care. The advent of wearable head-mounted display devices, most notably Google Glass, opens new possibilities for this research. This study aims to recognize everyday activities, often called ambient activities. The system was developed to run online using a smartphone and a head-mounted display. It achieved an accuracy above 90%, from which we conclude that it can recognize the target activities with high accuracy.

Keywords—Android, head-mounted display, accelerometer, sensor.

1 Introduction

Human activity recognition (HAR) has recently become a popular research area, because its results can be applied to many fields, such as the military, commerce, and especially health care. One benefit of activity recognition in the health field is predicting a user's risk of falling [1]. Another HAR study is HEMOCS, which stands for Head Movement Controller System [2]. That work recognized the head movements of the user; its purpose was to provide an alternative way of interacting with a computer, especially for users with physical limitations and for the elderly. HAR has also been used to recognize a person's gait, which in turn can be used to determine the user's gender [3].
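To make the idea of accelerometer-based activity recognition concrete, the following is a minimal illustrative sketch (not the authors' implementation, whose method is described later in the paper): accelerometer samples are grouped into fixed windows, simple statistics are computed per window, and a toy rule labels the window. The window size, feature set, and threshold are hypothetical choices for demonstration only; a real system would use a trained classifier.

```python
import math

def window_features(samples):
    """Mean and standard deviation of acceleration magnitude
    over one window of (x, y, z) samples."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    std = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
    return mean, std

def classify(samples, still_std=0.5):
    """Toy rule: low magnitude variance -> 'idle', otherwise 'moving'.
    The 0.5 m/s^2 threshold is an illustrative value, not from the paper."""
    _, std = window_features(samples)
    return "idle" if std < still_std else "moving"

# Simulated windows: a stationary head vs. a walking user.
idle_window = [(0.0, 0.0, 9.8)] * 50
moving_window = [(0.0, 0.0, 9.8 + 3.0 * math.sin(i / 2)) for i in range(50)]
print(classify(idle_window))    # -> idle
print(classify(moving_window))  # -> moving
```

On an actual Android device, the samples would come from a `SensorEventListener` registered for `Sensor.TYPE_ACCELEROMETER`, with classification running on the smartphone so the result can be shown on the head-mounted display in real time.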
Given the many benefits of HAR, research in this field deserves to be expanded.

iJIM ‒ Vol. 11, No. 3, 2017