Vision Guidance System for Visually Impaired with Augmented Reality
Kai Tjun Fong^a, Ka Fei Thang^b
School of Engineering, Asia Pacific University of Technology & Innovation
Technology Park Malaysia, Bukit Jalil, 57000 Kuala Lumpur, Malaysia
fongkaitjun@hotmail.com^a, ka.fei@apu.edu.my^b
ABSTRACT
Visually impaired and blind persons (VIPB) have a low level of personal autonomy when interacting with surrounding objects. The Vision Guidance System (VGS) is proposed as a substitute for human vision that allows VIPB to recognize objects and their positions on tables in a room. A pattern-matching algorithm with a neural network and a position-locating algorithm are developed and implemented on a simple prototype built with a USB webcam, laptop, earphone, hat and bag. VGS informs users of the detected objects and their positions through the earphone. Experimental results are presented at the end of this paper to demonstrate the performance of the algorithms developed for VGS.
KEYWORDS
Image Processing, Shape-Based Recognition, Pattern Recognition, Position Locating, Multilayer Perceptron Neural Network
1 INTRODUCTION
Living autonomously is one of the greatest gifts humankind has, as it grants people the freedom to govern their own behavior. However, this gift is weakened once a person loses their sight. According to estimates from the Universal Eye Health programme launched by the World Health Organization, there were 285 million visually impaired people worldwide in 2010, of whom 39 million were blind.
Hence, this strengthens the case that assistive technologies are, and will remain, essential in the future. In recent years, many researchers and developers have put more effort into developing self-navigating and obstacle-detecting devices that assist VIPB in living according to their own will. Nonetheless, the problems VIPB encounter have not ceased even after various assistive technologies were introduced. VIPB are incapable of seeing and interacting with surrounding objects in a room as they wish, which makes it even more challenging for them to become contributing citizens or to integrate into society.
Hence, this paper proposes the Vision Guidance System (VGS), an assistive system with a USB webcam mounted on a hat, used to capture video frames of the environment that are fed into position-locating and pattern-matching algorithms with a neural network. The pattern-matching algorithm with a Multilayer Perceptron Neural Network is trained on the desired objects so that the system can recognize them accurately when similar objects are presented in the future. A simple position-locating algorithm identifies whether an object is to the user's left, right or front. Instant pre-recorded messages are delivered through the earphone, informing the user of the object's details and allowing them to be aware of and interact with objects within a room.
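The pattern-matching idea above can be illustrated with a minimal sketch of a multilayer perceptron trained on hand-crafted shape features. This is not the authors' implementation: the feature values (aspect ratio, circularity), the object classes, the network size and the training scheme are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: [aspect_ratio, circularity] per sample (assumed features).
X = np.array([[1.0, 0.90], [1.1, 0.85],   # class 0: "cup" (round)
              [3.0, 0.20], [2.8, 0.25]])  # class 1: "pen" (elongated)
y = np.array([0, 0, 1, 1])

# One hidden layer of 4 sigmoid units, one sigmoid output.
W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):                      # plain full-batch gradient descent
    h = sig(X @ W1 + b1)                   # hidden-layer activations
    out = sig(h @ W2 + b2).ravel()         # probability of class 1
    d_out = (out - y) * out * (1 - out)    # squared-error output gradient
    d_h = np.outer(d_out, W2.ravel()) * h * (1 - h)
    W2 -= 0.5 * (h.T @ d_out[:, None]); b2 -= 0.5 * d_out.sum()
    W1 -= 0.5 * (X.T @ d_h);            b1 -= 0.5 * d_h.sum(axis=0)

def classify(features):
    """Label an unseen feature vector with the learned classes."""
    p = sig(sig(np.asarray(features) @ W1 + b1) @ W2 + b2).item()
    return "pen" if p > 0.5 else "cup"
```

Once trained, a detected object's features are classified and the matching pre-recorded message can be selected, e.g. `classify([2.9, 0.22])` yields `"pen"`.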
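The position-locating rule can likewise be sketched by splitting the frame into three vertical bands and mapping the detected object's centroid to "left", "front" or "right". The equal-thirds thresholds below are an assumption; the paper does not specify the exact band boundaries.

```python
def locate(centroid_x, frame_width):
    """Return 'left', 'front' or 'right' for an object centroid x-coordinate."""
    if centroid_x < frame_width / 3:
        return "left"
    elif centroid_x < 2 * frame_width / 3:
        return "front"
    return "right"

# e.g. a 640-pixel-wide frame
print(locate(100, 640))  # left band -> "left"
print(locate(320, 640))  # centre band -> "front"
print(locate(600, 640))  # right band -> "right"
```

The returned label would then index into the pre-recorded audio messages played through the earphone.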
2 LITERATURE REVIEW
The Vision Guidance System (VGS) comprises three stages: information acquisition, information transformation and information conveying. The basic concept of VGS is
Proceedings of the International Conference on Electrical and Electronic Engineering Telecommunication Engineering, and Mechatronics, Kuala Lumpur, Malaysia, 2015
ISBN: 978-1-941968-18-5 ©2015 SDIWC 6