Orientation sensing for gesture-based interaction with smart artifacts

Alois Ferscha*, Stefan Resmerita, Clemens Holzmann, Martin Reichör
Department of Pervasive Computing, Johannes Kepler University of Linz, Linz, Austria

Received 31 August 2004; accepted 15 December 2004; available online 11 May 2005

* Corresponding author. Tel.: +43 732 2468 8555; fax: +43 732 2468 8426. E-mail addresses: ferscha@soft.uni-linz.ac.at (A. Ferscha), resmerita@soft.uni-linz.ac.at (S. Resmerita), holzmann@soft.uni-linz.ac.at (C. Holzmann).

Abstract

Orientation sensing is considered an important means of implementing artifacts enhanced with embedded technology (often referred to as 'smart artifacts'), which exhibit embodied means of interaction based on their position, orientation, and the respective dynamics. Considering artifacts subject to manual (or 'by-hand') manipulation by the user, we identify hand-worn, hand-carried and (hand-)graspable real-world objects as exhibiting different artifact orientation dynamics, justifying an analysis along these three categories. We refer to orientation dynamics as 'gestures' in an abstract sense, and present a general framework for orientation-sensor-based gesture recognition. The framework specification is independent of sensor technology and classification methods, and elaborates an application-independent set of gestures. It enables multi-sensor interoperability and accommodates a variable number of sensors. A core component of the framework is a gesture library containing gestures from three categories: hand gestures, gestures of artifacts held permanently in the hand, and gestures of artifacts that are detached from the hand and manipulated occasionally. A gesture detection and recognition system based on inertial orientation sensing is developed and composed into a gesture-based interaction development framework. The use of this framework is demonstrated with the development of tangible remote controls for a media player, both in hardware and in software.

© 2005 Elsevier B.V. All rights reserved.

Keywords: Inertial sensors; Orientation tracking; Embodied interaction; Gesture recognition; Tangible user interface

1. Introduction

Embodied interaction [21] aims at facilitating remote control applications by providing natural and intuitive means of interaction, which are often more efficient and powerful than traditional interaction methods [25]. It is related to Tangible User Interfaces (TUIs), which were introduced by Ishii and Ullmer in [7]. TUIs couple physical representations (e.g. spatially manipulable physical artifacts) with digital representations (e.g. graphics and sounds), making bits directly manipulable and perceptible by people. In general, tangible interfaces rely on the use of physical artifacts as representations of and controls for digital information. Since the physical state of an artifact is usually changed through human manipulation, the position and orientation of the artifact are ideal candidates for enabling gestural interaction. In particular, orientation can be seen as input to the digital world model of an artifact, as depicted in Fig. 1. In this context, the present paper addresses the issues of gesture-based interaction for remote control, and the use of orientation sensors for gesture detection and recognition. The main objective is to speed up the practical realization of intuitive gestural interaction by providing application developers and sensor manufacturers with common specifications of information structures and operational requirements that enable the use of various types of sensors and artifacts to control, by gestures, a wide spectrum of applications.
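To make the notion of orientation as input to an artifact's digital world model (Fig. 1) concrete, consider the following minimal sketch in Python. All names here (OrientationSample, ArtifactModel) are illustrative assumptions and not part of the paper's specification; orientation is represented as Euler angles for simplicity, although a quaternion representation would serve equally well.

```python
import time
from dataclasses import dataclass, field

@dataclass
class OrientationSample:
    """A single orientation reading, here as Euler angles in degrees."""
    yaw: float
    pitch: float
    roll: float
    timestamp: float = field(default_factory=time.time)

class ArtifactModel:
    """Digital world model of a physical artifact: every manual
    manipulation of the artifact appears here as a new orientation state."""

    def __init__(self, artifact_id: str):
        self.artifact_id = artifact_id
        self.history: list[OrientationSample] = []

    def update(self, sample: OrientationSample) -> None:
        self.history.append(sample)

    def current(self) -> OrientationSample | None:
        return self.history[-1] if self.history else None

# Example: a user tilts the artifact forward; a sensor driver would
# push the new reading into the artifact's digital counterpart.
cube = ArtifactModel("media-cube")
cube.update(OrientationSample(yaw=0.0, pitch=45.0, roll=0.0))
```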
The focus on inertial orientation sensors is motivated, on the one hand, by the availability of a large array of such sensors from different manufacturers. On the other hand, the use of orientation for gestures has received far less attention from the research community than the use of position. In gesture sensing and recognition, orientation is commonly treated as information additional to position. However, orientation sensing has advantages over position sensing that are important in various circumstances, and these are discussed in this paper.
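As an illustration of the sensor- and classifier-independence the framework aims at, the sketch below separates vendor-specific sensor adapters from a pluggable classification routine and the three gesture categories identified above. The interfaces OrientationSensor and GestureRecognizer, the category names, and the toy 'flip' classifier are hypothetical examples, not the framework's actual interfaces.

```python
from abc import ABC, abstractmethod
from enum import Enum, auto
from typing import Callable, Optional, Sequence

Sample = tuple[float, float, float]  # (yaw, pitch, roll) in degrees

class GestureCategory(Enum):
    HAND = auto()       # gestures of the (instrumented) hand itself
    HELD = auto()       # gestures of artifacts held permanently in the hand
    GRASPABLE = auto()  # gestures of detached, occasionally grasped artifacts

class OrientationSensor(ABC):
    """Adapter interface: concrete subclasses wrap one vendor's sensor
    protocol and deliver readings in a common format."""
    @abstractmethod
    def read(self) -> Sample: ...

class GestureRecognizer:
    """Feeds a growing window of orientation samples to a pluggable
    classifier and notifies listeners when a library gesture is matched."""

    def __init__(self, classify: Callable[[Sequence[Sample]], Optional[str]]):
        self.classify = classify          # any classification method
        self.window: list[Sample] = []
        self.listeners: list[Callable[[str], None]] = []

    def feed(self, sample: Sample) -> None:
        self.window.append(sample)
        gesture = self.classify(self.window)
        if gesture is not None:
            for notify in self.listeners:
                notify(gesture)
            self.window.clear()

# Example: a trivial threshold classifier recognizing a 'flip' gesture.
def flip_classifier(window: Sequence[Sample]) -> Optional[str]:
    return "flip" if any(abs(roll) > 150.0 for _, _, roll in window) else None

recognizer = GestureRecognizer(flip_classifier)
recognizer.listeners.append(lambda g: print("gesture:", g))
recognizer.feed((0.0, 0.0, 170.0))  # prints: gesture: flip
```

Because the recognizer depends only on the common Sample format and a classification callable, sensors from different manufacturers and arbitrary classification methods can be exchanged without touching application code, which is the interoperability property the framework is intended to provide.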