Mixed Interaction Space – Designing for Camera-Based Interaction with Mobile Devices

Thomas Riisgaard Hansen
Center for Pervasive Healthcare
Department of Computer Science, University of Aarhus
Aabogade 34, DK-8200 Aarhus N, Denmark
thomasr@daimi.au.dk

Eva Eriksson
Interactive Spaces, ISIS Katrinebjerg
Department of Computer Science, University of Aarhus
Aabogade 34, DK-8200 Aarhus N, Denmark
evae@interactivespaces.net

Andreas Lykke-Olesen
Department of Design
Aarhus School of Architecture
Nørreport 20, DK-8000 Aarhus C, Denmark
alo@interactivespaces.net
+45 8936 0000

ABSTRACT
In mobile devices such as mobile phones and PDAs, an integrated camera can be used to interact with the device in new ways. In this paper we introduce the term mixed interaction space and argue that the possibility of using mixed interaction spaces is what distinguishes camera-based interaction from other types of sensor-based interaction on mobile devices. We present our implemented applications as well as related work that uses mixed interaction spaces. Based on this, we address how a mixed interaction space can have different identities, how it can be mapped to applications, and how it can be visualized.

Author Keywords
Input and interaction technologies, augmented reality, tangible UI, interaction design.

ACM Classification Keywords
H.5.2. User Interfaces (Interaction styles, Haptic I/O, GUI)

INTRODUCTION
An increasing number of today's mobile devices are equipped with integrated cameras, which can be used to determine how the devices are manipulated. By applying image-analysis algorithms to the camera images, the movement of the device, and in some cases its rotation and tilt, can be determined. This input technology has been used to implement a range of applications, e.g. by SpotCode [13], SemaCode [12] and Rohs [10]. To a great extent, the focus in these projects is on the technology itself, whereas the interaction technique is not discussed or analyzed.

In this paper we introduce the term mixed interaction space and argue that the possibility of using the position in space is what distinguishes interaction techniques based on the integrated camera from techniques that use e.g. accelerometers or compasses as input sensors. We present our own work with mixed interaction spaces and discuss how it relates to other projects that use similar techniques. We then discuss mixed interaction spaces in detail and point out three important characteristics: identity, mapping and visualization.

MIXED INTERACTION SPACE
Several novel interaction techniques exist for mobile devices to supplement button and pen interaction. Speech is an obvious candidate: the speech-recognition systems available on mobile devices can efficiently be used to select a specific command, e.g. calling a specific number. However, voice commands introduce cognitive overhead even for simple navigation [4], which can be a problem. Accelerometers, sometimes combined with a compass, make it possible to interact with an application by using the tilting, rotation and movement of the device as input. The clear advantage of this interaction technique is its independence of the surroundings, which is why it supports mobility very well. It enables new ways of interacting with applications, e.g. scrolling by tilting the device [6].

Interaction techniques that use integrated cameras strongly resemble interactions that can be designed with accelerometers: the movement, rotation and tilting of the device can partly be extracted by running optical-flow algorithms on the camera images, as the sketch below illustrates.
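As an illustration of this first, accelerometer-like use of the camera, the following minimal sketch estimates device movement from consecutive frames. It is not the paper's implementation: it assumes a desktop Python/OpenCV (cv2) environment and a mostly static scene, and simply takes the negated mean dense optical-flow vector as an approximation of how the device moved.

```python
# Minimal sketch (not the paper's implementation): approximate device
# movement by averaging dense optical flow between consecutive frames.
import cv2

def estimate_motion(prev_gray, curr_gray):
    """Return an approximate (dx, dy) device movement in pixels."""
    # Farneback dense optical flow; positional args are
    # (prev, next, flow, pyr_scale, levels, winsize, iterations,
    #  poly_n, poly_sigma, flags).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # With a static scene, the image appears to move opposite to the
    # device, so the negated mean flow approximates device motion.
    return -flow[..., 0].mean(), -flow[..., 1].mean()

cap = cv2.VideoCapture(0)          # stand-in for the phone's camera
ok, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    dx, dy = estimate_motion(prev, curr)
    print(f"device moved approx. dx={dx:.1f}, dy={dy:.1f} px")
    prev = curr
```

Note that, like accelerometer input, optical flow of this kind only yields relative motion; it does not anchor the interaction to any fixed point in the world.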
However, the camera images can provide more information than a movement, tilt or rotation vector. They can be used to identify a feature, or fixed point, and the device can calculate its rotation, tilt and position relative to this point. A space is spanned from this fixed point to the edge of the camera's view (see Figure 1), and this space is what we call the mixed interaction space. With mixed we emphasize that the space is a physical space, but at the same time the space plays an important role in the digital interaction, which is controlled by the movement of the device within the space.
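To make the idea concrete, the sketch below derives a device position inside such a space from a single frame. It is our illustration, not the paper's implementation: it assumes the fixed point is a printed circle of known size, detected with OpenCV's Hough circle transform, and the focal-length constant is a placeholder that would have to be calibrated for a real camera.

```python
# Minimal sketch (illustration only): compute the device's position in
# a mixed interaction space spanned from a fixed point, here assumed to
# be a printed circle of known physical size.
import cv2

def device_position(gray, circle_radius_cm=2.0, focal_px=500.0):
    """Return (x, y, z) of the device relative to the fixed point,
    or None if the fixed point is not in view (device outside the space).

    x, y are the circle's offset from the image centre, normalized to
    -1..1; z is a rough distance from the pinhole relation
    z = focal_px * real_radius / pixel_radius. focal_px is an assumed
    constant that must be calibrated for a real camera.
    """
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT,
                               dp=1.5, minDist=100,
                               param1=100, param2=40,
                               minRadius=10, maxRadius=200)
    if circles is None:
        return None
    cx, cy, r = circles[0][0]      # strongest circle candidate
    h, w = gray.shape
    x = (cx - w / 2) / (w / 2)     # -1..1, left to right
    y = (cy - h / 2) / (h / 2)     # -1..1, top to bottom
    z = focal_px * circle_radius_cm / r
    return x, y, z
```

Moving the device sideways changes x and y, while moving it towards or away from the fixed point changes z; together these dimensions span the cone-shaped space between the fixed point and the camera.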