STOP MAKING SENSE: DESIGNING SENSOR-BASED INTERACTIONS TO FACILITATE EXPLORATION AND REFLECTION

YVONNE ROGERS and HENK MULLER
University of Sussex¹ and University of Bristol

Current sensor-based interactions have a number of inherent usability problems. The coupling between users' actions and system feedback can be unclear, leaving users not knowing what causes a system response or how to make something happen. One reason why such confusion arises is that sensor technology is often unfairly compared with graphical user interfaces: highly constrained and precise user interactions (e.g. a key press) are replaced with much less constrained and imprecise ones. This is not a good use of sensor technology. An alternative approach is to consider activities to which sensor-based interactions are better suited, and where their inherent uncertainty can be put to good use. These include aesthetic and creative applications involving reflection, exploration and discovery. To this end, we present a conceptual framework intended to inform and inspire the design of sensor-based interactions. We illustrate its applicability by describing a series of novel sensor-based interactions that we designed to instill a playful learning experience in young children.

Categories and Subject Descriptors: B.4.2 [Input/Output Devices]; H.1.2 [Models and Principles]: User/Machine Systems - Human Factors; H.1.1 [Models and Principles]: Systems and Information Theory - General Systems Theory; H.5.2 [User Interfaces]

General Terms: Design, Human Factors

Additional Key Words and Phrases: Sensor-based interaction, exploration, conceptual framework, user experience

1. INTRODUCTION

Sensor technologies, such as motion, pressure, location and proximity sensors, are increasingly being used as input devices to computer systems. However, many of these are proving to be less than optimal ways of interacting. Part of the reason is that sensor technologies have been used to replace the user interfaces of existing applications. A mundane example is the replacement of physical switches and handles, which for centuries have been used to support everyday tasks such as turning lights or taps on and off. Motivated by recent trends towards improving hygiene and saving energy, highly visible physical controls in public places are increasingly being replaced by invisible sensors, removing the need for human hands ever to turn things on or off.

Within the research community, sensor technology has largely been used to investigate how 'context-aware' information can be given to people on various kinds of mobile devices: delivering assumed-to-be relevant information to the user at assumed-to-be appropriate times (e.g. Schilit et al., 1994; Marmasse and Schmandt, 2000). Again, a main benefit of doing this is to create a lightweight 'hands-free' user interaction, removing the cognitive overhead typically associated with having to type in queries or select options from menus to find desired information. Instead, certain forms of information are designed to pop up at certain times on a computer screen (usually a PDA), or to be presented via audio, based on the user's location in an environment (a minimal sketch of this style of triggering appears at the end of this section). A main motivation behind this kind of sensor-driven research is the desire to enable the "tighter integration of information and perception...allowing for more natural,

¹ Contact author: Interact Lab, School of Cognitive and Computing Sciences, University of Sussex, Brighton, BN1 9QH, UK. Email: yvonner@cogs.susx.ac.uk
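To make the style of location-triggered delivery described above concrete, the following is a minimal sketch in Python. It is illustrative only: the positioning function, coordinates, trigger radius and messages are all hypothetical stand-ins, not drawn from the paper or from the systems cited above.

    import math

    # Hypothetical points of interest: (x, y) positions mapped to messages
    # that are assumed to be relevant near those positions.
    POINTS_OF_INTEREST = {
        (12.0, 40.0): "Exhibit A: touch the panel to hear the audio guide.",
        (55.0, 18.0): "Cafe ahead: today's menu is posted at the counter.",
    }

    TRIGGER_RADIUS = 5.0  # metres within which a message is assumed appropriate

    def read_location():
        """Stand-in for a real positioning sensor (GPS, infrared beacons, etc.)."""
        return (12.5, 39.0)  # fixed value so the sketch runs without hardware

    def nearby_messages(position):
        """Yield messages whose point of interest lies within TRIGGER_RADIUS."""
        px, py = position
        for (x, y), message in POINTS_OF_INTEREST.items():
            if math.hypot(px - x, py - y) <= TRIGGER_RADIUS:
                yield message

    if __name__ == "__main__":
        # Poll the sensor once and 'push' any matching information to the user.
        for message in nearby_messages(read_location()):
            print(message)

Even in this toy form, the coupling problem raised above is visible: the trigger radius and the points of interest are never revealed to users, so when a message appears (or fails to appear) the cause of the system's response remains hidden.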