Anais do V Congresso Brasileiro de Eletromiografia e Cinesiologia e X Simpósio de Engenharia Biomédica - ISBN: 978-85-5722-065-2 - DOI: 10.29327/cobecseb.78867

TOWARDS AN ASSISTIVE INTERFACE TO COMMAND ROBOTIC WHEELCHAIRS AND INTERACT WITH ENVIRONMENT THROUGH EYE GAZE

E. H. Montenegro-Couto*, K. A. Hernandez-Ossa*, A. C. Bissoli*, M. Sime**, T. F. Bastos-Filho*

*Assistive Technology Group (NTA), Postgraduate Program in Electrical Engineering, **Assistive Technology Group (NTA), Postgraduate Program in Biotechnology, Federal University of Espirito Santo, Av. Fernando Ferrari 514, Vitoria, Brazil

e-mails: eduardo.hmc1@gmail.br, teodiano.bastos@ufes.br

Abstract: This work presents an intuitive and customizable assistive technology based on eye gaze, which integrates a previously developed multimodal assistive domotics system with the UFES robotic wheelchair. Users with motor disabilities can control home devices, communicate with family members or caregivers through short phrases, and navigate a wheelchair by means of eye gaze. The interface is easy to use: a computer and a monitor on board the wheelchair display options for the user to select, and this selection is made using an eye tracker. Experimental results with volunteers showed good performance in terms of system usability. The main goal of this system is to improve users' quality of life by providing augmentative and alternative communication, mobility assistance, and support for activities of daily living.

Keywords: Assistive Technology, Wheelchair, Domotics, Eye Gaze.

Introduction

There has been major growth in the development and application of assistive technology for people with disabilities over the last decades. The main goal is to reduce their discomfort in a variety of tasks, mainly routine activities, by providing more self-sufficiency and reducing dependency on external help [1].
For some severely disabled people, the main element to promote life improvement is an electric-powered wheelchair (EPW). According to the United States census, 3.6 million people currently use a wheelchair for everyday activities [2]. However, many individuals with motor-related diseases or injuries, such as spinal cord injury, traumatic brain injury, multiple sclerosis, congenital problems, quadriplegia, or cerebral palsy, are unable to use a conventional manually controlled wheelchair [3], or even remote controls or smartphones; in some cases, even voice control may be difficult. For severely disabled people, their intention can be recognized either invasively or non-invasively to command devices and promote life improvement [4].

To command electric-powered wheelchairs, a wide variety of approaches have been proposed, among them: joysticks [5], EEG [6], EMG [7], hybrid EEG/EMG [8], and even a multimodal interface with the flexibility to choose among different communication modalities (eye blinks, eye movements, head movements, blowing or sucking on a straw, and brain signals), depending on the user's level of disability [9]. Nonetheless, even with advances in the state of the art, assistive technologies struggle to become handy tools. The signal acquisition equipment is one of the main obstacles preventing biosignal-based systems from making their way out of research labs.

Wheelchair users usually spend a substantial part of their time at home, so there is wide interest in assistive systems that allow users, while seated in a wheelchair, to operate several common home appliances. The combination of assistive technology and home automation is an active field of study, often called assistive domotics [10]. Thus, the design of a SMAD (System for Multimodal Assistive Domotics) addresses everyday problems, improving functionality and the performance of activities of daily living (ADL).
Recent research at NTA/UFES developed this SMAD, in which control is achieved by means of biological signals [11]. In particular, improvements were made to the control modality, and it is now possible to control the SMAD through eye gaze [12]. In addition, SimCadRoM (Electric-Powered Wheelchair Simulator), a virtual reality system for EPW driving training and for testing control interfaces, was used to test and validate the prototype before its use on a real wheelchair. The impact of eye-gaze-based systems on the quality of life of people with amyotrophic lateral sclerosis, a neurodegenerative disease, is studied in [13].

The current work presents the online control of a robotic wheelchair, and of the equipment connected through the SMAD's SmartBox, using an eye gaze tracking system. The eye tracker is attached to the wheelchair, together with an on-board computer and monitor that allow the user to navigate through several screens and select a desired functionality to be performed. This research provides users with an easy and practical way to manage their activities of daily living.
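The paper does not detail how a gaze fixation is turned into a screen selection. A common mechanism in eye-gaze interfaces of this kind is dwell-time selection: an option is triggered when the gaze point remains inside its on-screen region for a minimum duration. The sketch below is a minimal illustration of that idea only, under our own assumptions; the button names, layout, coordinates, and dwell threshold are hypothetical and are not taken from the SMAD implementation.

```python
from dataclasses import dataclass


@dataclass
class Button:
    """A rectangular on-screen option (screen coordinates, top-left origin)."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        # True if the gaze point (gx, gy) falls inside this button's rectangle.
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h


def dwell_select(samples, buttons, dwell_time=1.0):
    """Return the name of the first button fixated for at least dwell_time seconds.

    samples: iterable of (timestamp_seconds, gaze_x, gaze_y) tuples, as an eye
    tracker might stream them. Returns None if no selection occurs.
    """
    current = None   # button the gaze is currently resting on
    start = None     # timestamp when the gaze entered that button
    for t, gx, gy in samples:
        hit = next((b for b in buttons if b.contains(gx, gy)), None)
        if hit is not current:
            # Gaze moved to a different region: restart the dwell timer.
            current, start = hit, t
        if current is not None and t - start >= dwell_time:
            return current.name
    return None


if __name__ == "__main__":
    # Two hypothetical options, e.g. a home-device command and a short phrase.
    buttons = [Button("lights_on", 0, 0, 100, 100),
               Button("call_caregiver", 200, 0, 100, 100)]
    # Gaze rests on the first button for 1.0 s.
    samples = [(0.0, 50, 50), (0.5, 55, 52), (1.0, 60, 48)]
    print(dwell_select(samples, buttons, dwell_time=1.0))
```

In a real system the dwell threshold trades speed against accidental activations (the "Midas touch" problem), and is typically made adjustable per user, in line with the customizability the paper emphasizes.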