FingerSense - Augmenting Expressiveness to Physical Pushing Button by Fingertip Identification

Jingtao Wang, John Canny
Computer Science Division
University of California at Berkeley, Berkeley, CA 94720-1776, USA
{jingtaow, jfc}@cs.berkeley.edu

ABSTRACT
In this paper, we propose a novel method, FingerSense, to enhance the expressiveness of physical buttons. In a FingerSense-enabled input device, a pressing action is differentiated according to the finger involved. We model the human performance of FingerSense interfaces and derive the related parameters from a preliminary usability study. Overall, our findings indicate that FingerSense is faster than traditional keypads when the finger-switching action can be parallelized.

Categories & Subject Descriptors: H5.2 [Information interfaces and presentation]: Input devices and strategies, Theory and methods

General Terms: Design, Human Factors.

Keywords: Text input, input device, mobile computing, fingerprint recognition, performance modeling.

INTRODUCTION
Tapping physical buttons is one of the most frequent tasks in computer-human interaction. In a button-based input device, e.g. the QWERTY keyboard, the 1/2/3-button mouse, or the telephone keypad, the user's fingers act as triggers for executing commands. Although alternative input modalities such as speech and handwriting are available, button-based interfaces, especially the keyboard, remain the most widely used input devices. The emergence of handheld, cell phone, and other mobile computing devices, however, presents unique challenges to traditional button interfaces: because of the size of human fingers and the corresponding motor-control accuracy, buttons cannot be made arbitrarily small, and it becomes increasingly difficult to fit a full QWERTY keyboard into ever smaller mobile devices.
In this paper, we propose an alternative method, FingerSense, to improve the expressiveness of pushing buttons without the cost of minimizing button size or adding extra keystrokes 1. In a FingerSense-enabled input device, a button responds differently when it is pressed by different fingers. As illustrated in Figure 1, when the thumb taps a given button, the action is interpreted as event A; if the index finger is used, the system interprets the action as event B; similarly, the middle finger corresponds to event C, and so on. As a result, a single pressing action can generate as many events as the user has fingers. We define FingerSense as the method of multiplexing a physical button according to the actual finger used in tapping, regardless of the underlying sensing/recognition technology used to distinguish fingers.

Figure 1. From classic buttons to a FingerSense button

To verify the effectiveness of FingerSense, we investigate the following three questions in this paper: 1) Is FingerSense technologically feasible? That is, is it possible to classify the finger tapping a button in real time and in a cost-effective manner? 2) To use FingerSense, the user must select and switch to the correct finger before tapping the intended button; does this procedure impose too high a cognitive workload for most users? 3) Is there a speed advantage for FingerSense-enabled text input compared with the state of the art?

In the next section, we survey projects and sensing technologies related to FingerSense, and then describe the implementation of a computer-vision-based prototype, which aims to demonstrate the feasibility of FingerSense.

1 Here, additional keystrokes also include pressing multiple buttons at the same time.

Copyright is held by the author/owner(s). CHI 2004, April 24-29, 2004, Vienna, Austria. ACM 1-58113-703-6/04/0004.
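The finger-to-event multiplexing described above can be sketched in a few lines of code. This is a minimal illustration only, not the paper's implementation: the finger labels, event names, and the `on_button_press` dispatch function are hypothetical, and the finger classification itself (handled in the paper by a computer-vision prototype) is assumed to happen upstream.

```python
# Hypothetical sketch of FingerSense dispatch: one physical button is
# multiplexed into several events, keyed by which finger the sensing
# layer reports. All names here are illustrative, not from the paper.

THUMB, INDEX, MIDDLE, RING, PINKY = range(5)

# A single button can generate as many events as the user has fingers.
EVENT_MAP = {
    THUMB: "event_A",
    INDEX: "event_B",
    MIDDLE: "event_C",
    RING: "event_D",
    PINKY: "event_E",
}

def on_button_press(classified_finger: int) -> str:
    """Return the event bound to the recognized finger.

    `classified_finger` would come from the underlying sensing/recognition
    technology; classification itself is out of scope for this sketch.
    """
    return EVENT_MAP.get(classified_finger, "event_unknown")
```

Under this scheme a 12-key telephone keypad could, in principle, expose up to 60 distinct events without shrinking any button or adding chorded keystrokes.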
In the following section, we present a theoretical model of FingerSense and quantitatively derive the parameters of this model through a preliminary usability study.

RELATED WORK
The key idea behind FingerSense is to detect and use the information implicitly encoded in specific fingers. To acquire and use such "information at your fingertips", many