F. Alvarez and C. Costa (Eds.): UCMEDIA 2010, LNICST 60, pp. 10–19, 2012.
© Institute for Computer Sciences, Social Informatics and Telecommunications Engineering 2012
A Multi-touch Solution to Build Personalized Interfaces
for the Control of Remote Applications
Gianluca Paravati¹, Mattia Donna Bianco², Andrea Sanna¹, and Fabrizio Lamberti¹

¹ Politecnico di Torino, Dipartimento di Automatica e Informatica,
C.so Duca degli Abruzzi 24, I-10129, Torino, Italy
² CEDEO.net, Via Borgionera 103, I-10040, Villar Dora (TO), Italy
{gianluca.paravati,andrea.sanna,fabrizio.lamberti}@polito.it,
mattia@cedeo.net
Abstract. This paper presents a framework for controlling remote applications
by means of personalized multi-touch interfaces. The designed framework
allows end-users to fully personalize the mapping between gestures and input
commands. A two-tier architecture has been developed. A formal description of
the original interface is automatically generated at the server side to identify a
set of available actions for controlling existing applications. The client is in
charge of loading the description of the target application, allowing the user to
shape the preferred mapping between gestures and actions. Finally, the server
converts the identified actions into one or more commands understandable by
the original computer interface. The implementation of the system for this work
specifically relies on handheld multi-touch devices. Test results are
encouraging, both from an objective and a subjective point of view; indeed, the
designed framework resulted to outperform a traditional GUI both in terms of
number of actions to perform a task and average completion time.
Keywords: Multi-touch, personalized interfaces, human-machine interface,
remote control.
1 Introduction
Human-machine interaction based on touch devices is quite common today. The
evolution of input device technologies, such as reflection-based or pressure-sensitive
touch surfaces, has led to the identification of the natural user interface (NUI) as the
natural next step in the evolution of human-machine interaction, following the shift
from command-line interfaces (CLI) to graphical user interfaces (GUI).
The main goal of human-machine interaction is to improve the way users and
computers communicate, by means of effective user interfaces. The design of user
interfaces requires a careful mapping of complex user actions in order to make
computers more intuitive, usable, and receptive to the user’s needs: in other words,
more user-friendly.
Gestures, and in particular hand gestures, have always played a crucial role in human
communication, as they constitute a direct expression of mental concepts [1]. The