IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, VOL. 61, NO. 11, NOVEMBER 2012 3103
Haptic, Audio, and Visual: Multimodal
Distribution for Interactive Games
Marco Gaudina, Student Member, IEEE, Victor Zappi, Student Member, IEEE,
Andrea Brogni, and Darwin G. Caldwell, Senior Member, IEEE
Abstract—Since technology started to be distributed on a large scale, the gaming experience has changed radically. After years of competition over graphical detail, attention has shifted to how users can interact with games. Our paper follows this direction, aiming to develop a unifying framework to distribute user interactions over different platforms. The main idea is to make the hardware a user is playing with transparent to the other players. Different platforms can run the same game, while the inputs from one are translated into outputs for another according to its own hardware capabilities. In addition, we present a conceptual model where the types of interaction between users depend not only on the available devices but primarily on their main purpose for the different actions during the game. The architecture was initially applied to a basic cross-platform multiplayer musical game. We then applied the conceptual model to a more complex multiplayer application to evaluate the interaction distribution concept with experimental data. The result was a shared game experience, in which players perceived the competitors' presence but not the strong differences in hardware equipment.
Index Terms—Games, haptic, interaction, multimodality.
I. INTRODUCTION

In the last 15 years, gaming technologies have evolved in several respects. After years of "gamepad" interaction, in single-player mode or with friends on a local split screen, the widespread adoption of the Internet and of networked multiplayer modes radically changed the way games are developed. New possibilities opened up for both users and developers, leading to a completely new form of social communication [1].
For a long time, the game controller remained almost the same on every game platform, but in the last decade, gaming consoles have reached such impressive graphical detail and realism that companies have shifted their attention to creating online gaming services. This has changed the way the player interacts with the game itself through the controller.

Manuscript received November 19, 2010; revised February 21, 2012; accepted April 2, 2012. Date of publication July 17, 2012; date of current version October 10, 2012. The Associate Editor coordinating the review process for this paper was Dr. Atif Alamri.
The authors are with the Department of Advanced Robotics, Istituto Italiano di Tecnologia, 16163 Genova, Italy (e-mail: marco.gaudina@iit.it; victor.zappi@iit.it; andrea.brogni@iit.it; darwin.caldwell@iit.it).
Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/TIM.2012.2202071

Nintendo, with the Nintendo Wii [2], introduced a completely different gaming philosophy, putting the user at the center of the game scene, where he/she once again acts physically during the game. There is no longer a joystick driving a graphical avatar on the screen: it is the player's own body that plays tennis, waves an arm, or punches the air in a boxing game. The Wii thus lets the user again perceive a more natural interaction with the gaming environment. Moreover, the spread of this technology drastically lowered the prices of many systems and devices that, only a few years ago, were a privilege of the few. This opened the opportunity to build different and complex systems dedicated to games.
Overall, we believe that a gaming experience can be designed across different platforms by distributing the users' interactions. To achieve this, the system needs to be independent of both hardware and software and to focus on the objective of the interaction itself, as defined by the application. We present a conceptual model that defines the architecture needed to interface different platforms offering different possible interactions. We introduce the concept of transparency as the hidden transformation of signals from one system to another: a user's actions are converted not only into actions in the game but also into feedback for another user, adapted to his/her hardware. Each player can thus exploit his/her platform to the fullest without thinking about the other players' platforms or the level of interaction they can achieve.
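The transparency idea can be sketched in a few lines of code. The following is a minimal illustration, not the paper's actual framework: all class names, channel names, and the preference order are our own assumptions. One user's action event is distributed to the other platforms, and each platform renders it with the best output channel it supports.

```python
from dataclasses import dataclass


@dataclass
class ActionEvent:
    """A hypothetical platform-independent event from any input layer."""
    kind: str         # e.g., "touch", "haptic_push"
    intensity: float  # normalized to [0, 1]


class Platform:
    """Each platform declares which output channels it supports."""

    def __init__(self, name, channels):
        self.name = name
        self.channels = channels  # e.g., {"force", "vibration", "visual"}

    def render(self, event):
        # The preference order below is an assumption for this sketch.
        for channel in ("force", "vibration", "visual"):
            if channel in self.channels:
                return f"{self.name}: {channel} feedback at {event.intensity:.1f}"
        return f"{self.name}: no output channel available"


def distribute(event, platforms):
    """Translate one user's action into feedback for every other platform."""
    return [p.render(event) for p in platforms]


# A push on a haptic arm reaches a force-feedback rig as force and a
# phone as vibration; neither sender nor receivers need to know the
# other's hardware.
haptic_rig = Platform("rig", {"force", "visual"})
phone = Platform("phone", {"vibration", "visual"})
feedback = distribute(ActionEvent("haptic_push", 0.8), [haptic_rig, phone])
```

The point of the sketch is that the translation happens inside the framework (here, in `render`), so it stays hidden, i.e., transparent, to the players.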
We implemented this interaction framework in a distributed application, allowing differently equipped platforms to interface and interact with one another in a multiuser musical game, described in Section IV. Different users tested the system, and their feedback suggested a deeper analysis and a measurement of the interactions between users.
Therefore, we carried out an experiment with a simple multimodal interactive game, described in Section V. In the game, the interactions were adapted, transformed, and scaled in accordance with the capabilities of each platform, involving haptic, visual, touch, and vibration feedback. By measuring the users' interactions and recording their behavior, we obtained a better view of how the whole framework works.
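The scaling step mentioned above can be illustrated with a small helper. This is a generic sketch under our own assumptions; the device ranges below are invented for the example and are not taken from the paper.

```python
def rescale(value, src_range, dst_range):
    """Linearly map a signal from one device's range onto another's,
    clamping to the destination range. Ranges are (min, max) tuples."""
    s_min, s_max = src_range
    d_min, d_max = dst_range
    t = (value - s_min) / (s_max - s_min)
    t = min(max(t, 0.0), 1.0)  # clamp out-of-range input
    return d_min + t * (d_max - d_min)


# E.g., a 3.3 N contact force on a haptic arm (hypothetical 0-5 N range)
# becomes a vibration amplitude on a phone motor (hypothetical 0-255 range).
amplitude = rescale(3.3, (0.0, 5.0), (0, 255))
```

A linear map with clamping is only one possible choice; a real framework might use perceptually motivated curves instead, but the principle of translating a signal between two devices' native ranges is the same.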
II. RELATED WORKS
Since the late 1990s, researchers have addressed the need for unifying hardware infrastructures to recreate multimodal interfaces for distributed applications [3]. These kinds of systems are agent driven and based on modularity, distribution, and asynchrony to regulate interapplication processes, using technologies that can adapt to the device they are running on. Recently, a framework for the rapid prototyping of multimodal interfaces was proposed in [4], outlining the advantages of unifying all the available resources. Musical games or
0018-9456/$31.00 © 2012 IEEE