Medical Technologies Journal, Volume: 3, Issue: 1, January-March 2019, Pages: 316-333. Doi :
https://doi.org/10.26415/2572-004X-vol3iss1p316-333
On Assisted Living of Paralyzed Persons
through Real-Time Eye Features Tracking and
Classification using Support Vector Machines
Type of article: Original
Qurban A Memon
Associate Professor
EE department, College of Engineering, UAE University, 15551, Al-Ain
Abstract
Background: Eye features such as eye blinks and eyeball movements can serve as a module in
assisted living systems that allow a class of physically challenged people to speak using their eyes.
The objective of this work is to design a real-time customized keyboard that a physically
challenged person can use to speak to the outside world, for example, to have a computer read a
story or a document, play games, or exercise nerves, through eye-feature tracking.
Method: In a paralyzed person's environment, right-left and up-down eyeball movements act as a
scroll, and an eye blink acts as a nod. The eye features are tracked using Support Vector Machines (SVMs).
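As a minimal sketch (not the authors' implementation), classifying eye states into blink and gaze-direction classes with an SVM could look as follows; the feature vectors and class labels here are synthetic stand-ins for real eye-region descriptors:

```python
# Sketch: classifying eye states (blink vs. gaze directions) with a
# linear SVM. Feature vectors are synthetic stand-ins for real
# eye-region descriptors (e.g., cropped-eye pixel intensities).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
LABELS = ["blink", "left", "right", "up", "down"]

# Synthetic training data: one well-separated Gaussian cluster per state.
X, y = [], []
for i, label in enumerate(LABELS):
    X.append(rng.normal(loc=i * 3.0, scale=0.5, size=(40, 16)))
    y += [label] * 40
X = np.vstack(X)

clf = SVC(kernel="linear").fit(X, y)

# A new frame's feature vector near cluster 0 classifies as "blink".
probe = rng.normal(loc=0.0, scale=0.5, size=(1, 16))
print(clf.predict(probe)[0])
```

In a real pipeline, each video frame would yield one such feature vector, and the predicted label would drive the keyboard's scroll/nod logic.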
Results: A prototype keyboard was custom-designed to work with eye-blink detection and eyeball-
movement tracking using Support Vector Machines (SVMs), and was tested in a typical paralyzed-
person environment under varied lighting conditions. Tests performed on male and female subjects
of different ages showed a success rate of 92%.
Conclusions: Since the system needs about 2 seconds to process one command, strict real-time
operation is not yet achieved. Efficiency can be improved through the use of a depth-sensor camera,
a faster processing environment, or motion estimation.
Keywords: Assisted living; Rehabilitation; Paralyzed persons; Eye-blink detection; Eyeball
detection; Biomedical engineering; SVM; Machine learning; Image processing.
Corresponding author: Qurban A Memon, EE department, College of Engineering, UAE University, 15551, Al-Ain
qurban.memon@uaeu.ac.ae
Received: 26 January, 2019, Accepted: 28 March, 2019, English editing: 04 March, 2019, Published: 01 April, 2019.
Screened by iThenticate. ©2017-2019 KNOWLEDGE KINGDOM PUBLISHING.
1. Introduction
Human feature detection and tracking are gaining importance each day due to
the wide variety of applications that can be built on them. One application is
constructing interactive ways for people with disabilities to communicate with
Internet-enabled devices [1]. Commuting and communication are the main issues
for these patients. One such class, people with tetraplegia/quadriplegia, faces
communication difficulties in particular. Another class has rehabilitative
disabilities (spinal cord injury, repetitive strain injury, etc.) and motor
disabilities (autism, cerebral palsy, Lou Gehrig's disease, and so forth).
Historically, techniques like Partner-Assisted Scanning (PAS) have been used to
help these people communicate. In this technique, the nurse/caregiver presents a
set of symbols (e.g., words, letters, pictures) on a screen to the disabled
patient, observes the patient's eyes on the screen, and then determines the
selection from among those symbols to express the patient's needs.
Augmentative and Alternative Communication (AAC) is a very general term
and is divided into two types: aided and unaided systems [2]. In aided