© 2015 Hussam Saad Adeen, Ayman Atia, Ahmad Amin, Andrew Victor, Abdelrahman Essam, Ehab Gharib and Mohamed
Hussien. This open access article is distributed under a Creative Commons Attribution (CC-BY) 3.0 license.
Journal of Computer Sciences
Original Research Paper
RemoAct: Portable Projected Interface with Hand Gesture
Interaction
Hussam Saad Adeen, Ayman Atia, Ahmad Amin, Andrew Victor, Abdelrahman Essam,
Ehab Gharib and Mohamed Hussien
Human Computer Interaction Lab., Computer Science,
Faculty of Computers and Information, Helwan University, Cairo, Egypt
Article history
Received: 05-11-2014
Revised: 29-06-2015
Accepted: 24-07-2015
Corresponding Author:
Ayman Atia
Human Computer Interaction
Lab., Computer Science,
Faculty of Computers and
Information, Helwan
University, Cairo, Egypt
Email: ayman@fci.helwan.edu.eg
Abstract: RemoAct is a wearable depth sensing and projection system that
enables interaction on many surfaces. It makes interaction with the
environment more intuitive by sharing and sending data to
surrounding devices through specific gestures. The system offers a
mobile, intuitive way to interact using an interface projected on
everyday flat surfaces. Every user has public and private areas:
users can create tiles on the fly and share them with others, and
public tiles are shown to other users through augmented reality.
Interaction is performed through hand gestures, finger tracking and
hand tracking, giving the user greater freedom of movement. Several
experiments were conducted to measure accuracy, and RemoAct was
compared against conventional methods in terms of accuracy, time
and user experience. Because the system lets users work
simultaneously, two users drawing one chart with RemoAct take
less time than with successive drawing. Gesture recognition
accuracy reached 90-95%; object recognition and face
identification accuracy varied with lighting conditions.
Keywords: Hand Gesture, Finger Tracking, Face Identification, Object
Recognition, Gesture Recognition, Portable Interface
Introduction
Recent advancement in mobile touch displays has
made people familiar with new ways of dealing with
computational devices. However, dealing with a touch
display is now considered conventional. Further, it
is inconvenient to hold a palm-sized device or to deal
with tiny menus and icons on a small screen.
Sharing views of visual data and images with others is
restricted by the mobile device's small display.
Interacting, sharing and sending data with other people
has been restricted to using the device itself rather than
a device-free environment. Recent research in Human
Computer Interaction (HCI) has opened a new vision for
dealing with computational devices so that the user does not
have to deal with the device itself. Rather, users can
transform everyday surfaces into interactive interfaces to
interact with their mobile devices. Sixth Sense uses markers
that are worn on the finger in order to interact with the
system (Mistry et al., 2009). New projection technologies
have introduced portable projectors that can project displays
on commonly used everyday surfaces, transferring the
handheld device's display to almost any surface the user
deals with in an everyday environment. Portable projectors
offer great flexibility to project displays on walls,
tabletops, hand-held objects or the user's body. This makes
it possible to adjust the display size to fit different
environments and allows better sharing of data and images
among multiple users. New interaction techniques have been
implemented, opening the way to more intuitive and
easier-to-learn interaction. Users can treat everyday
surfaces such as walls, tabletops and hand-held objects as
interfaces. However, adopting this new interaction technique
requires particular hardware and sensing devices.
According to a survey of vision-based hand-gesture
applications (Wachs et al., 2011), hand gestures are
useful for computer interaction because they are a
primary and expressive form of human communication.
People naturally communicate using hand gestures,
finger pointing and body movement. Hand-gesture
interaction could become even more important in
applications such as projected interfaces, owing to
its ease of access and naturalness of control.