D. Uhler, K. Mehta, and J.L. Wong (Eds.): MobiCase 2012, LNICST 110, pp. 100–114, 2013.
© Institute for Computer Sciences, Social Informatics and Telecommunications Engineering 2013
Towards Multimodal 3D Tabletop Interaction
Using Sensor Equipped Mobile Devices
Florian Klompmaker¹, Karsten Nebe², and Julien Eschenlohr¹
¹ University of Paderborn, C-LAB,
Fürstenallee 11, 33647 Paderborn, Germany
florian.klompmaker@c-lab.de, eschenlohr@googlemail.com
² Rhine-Waal University of Applied Sciences,
Südstraße 8, 47475 Kamp-Lintfort, Germany
Karsten.Nebe@hochschule-rhein-waal.de
Abstract. Interactive tabletops have proven to be well suited for
collaborative work, especially in combination with mobile devices. Furthermore,
many application scenarios require the visualization of 3D data. We therefore
present multimodal 3D interaction techniques for tabletops that allow
simultaneous control of six degrees of freedom using sensor-equipped mobile
devices. In two early user studies we compared multitouch, tangible interaction,
and sensor-equipped smartphones in order to start a User Centered Design
process. We obtained important results regarding effectiveness, intuitiveness, and
user experience. Most notably, we found that mobile devices equipped with
acceleration sensors are very well suited for 3D rotation tasks.
Keywords: Tabletop, Multitouch, 3D User Interface, Mobile 3D Interaction,
Smartphone, User Centered Design.
1 Introduction
Since the first Microsoft Surface appeared on the market in 2008, tabletop devices
have become quite popular, and much effort has been spent on technical
improvements and the development of new interaction paradigms. It has been shown
that these devices are suitable for special use cases such as rapid prototyping [1],
architecture and city/landscape planning [2,3], disaster management [4],
collaborative document management [5], as well as training and learning [6,7]. In
general terms, it has been shown that tabletop devices in combination with Natural
User Interfaces can dramatically enhance collaborative scenarios. One reason for
this is the size of the display: it allows every participant in a collaborative setting
to see WHO is currently doing WHAT. This so-called awareness of others [8,9]
cannot be found in classical Computer Supported Cooperative Work scenarios if
multiple private devices are used instead of a large shared display. When
developing collaborative applications for tabletop devices, designers and
programmers have to consider several aspects such as the orientation of graphical
objects [10], private and public spaces [11], security issues, and group
dynamics [12]. We have shown that User Centered Design (UCD) is a