The Hapticon Editor: A Tool in Support of Haptic Communication Research

Mario J. Enriquez and Karon E. MacLean
Department of Computer Science
University of British Columbia
enriquez@cs.ubc.ca, maclean@cs.ubc.ca

Abstract

We define haptic icons, or “hapticons”, as brief programmed forces applied to a user through a haptic interface, with the role of communicating a simple idea in a manner similar to visual or auditory icons. In this paper we present the design and implementation of an innovative software tool and graphical interface for the creation and editing of hapticons. The tool’s features include several methods for creating new icons, including direct recording of manual trajectories and creation from a choice of basis waveforms; novel direct-manipulation icon editing mechanisms; integrated playback; and convenient storage of icons to file. We discuss some ways in which the tool has aided our research in haptic iconography, and present an innovative approach for generating and rendering simple textures on a low-degree-of-freedom haptic device using what we call terrain display.

1. Introduction

Visual and auditory icons have long been integral to computer interfaces as a means of indicating functionality, location and other low-dimensional information more efficiently than displayed text can [1,2]. Graphic icons, for example, are small, concise graphic representations of real or abstract objects. These icons should be easily identifiable by the user and can represent a spectrum of information, ranging from specific functions to abstract controls. In everyday interaction with manual controls such as those found in a car, on a workbench or throughout a building, we use parameters such as shape, texture and muscle memory to identify and locate the different functions and states of handles ranging from doorknobs to pencils and radio controls. With the introduction of “active” haptic interfaces, a single handle - e.g.
a knob or a joystick - can control several different and perhaps unrelated functions. These multi-function controllers can no longer be differentiated from one another by position, shape or texture differences, and it becomes a design challenge to make both the existence of available functions and their identity apparent to the user. Active haptic icons, or “hapticons”, may be able to solve this problem by rendering haptically distinct and meaningful sensations for the different functions. A systematic approach to hapticon design requires tools that allow people without an engineering background to participate more closely in the creative process, thus broadening and enriching the area. The Hapticon Editor, with its simple, efficient approach, is such a tool.

2. Related Work

2.1. Icon Design

There has been a great deal of work on the design of auditory and visual icons. The auditory and haptic iconic design spaces share many key attributes: both are temporally sequential, and human perception has narrow limits for amplitude and period discrimination in each. Thus, in our hapticon research program, we have found it most productive to follow auditory icon design. There have been two principal approaches to using sound to iconify information. Gaver et al. [3,4] studied “Auditory Icons”: representations of objects or notions that embody a literal, direct meaning - for example, using the sound of paper being crumpled to indicate deleting a computer file. Most of us are familiar with both the sound of crumpling paper and the action of deleting a file, and can easily make the association. While this is an intuitive approach, it does not address whether users can differentiate the icons or how many icons can be distinguished. Brewster et al. [5,6] took a different approach: “Earcons” are sounds and rhythms with no innate meaning, whose target or meaning must be learned.
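To make the analogy concrete, an Earcon-like hapticon can be thought of as a rhythmic force trajectory assembled from basis waveforms, as described in the abstract. The sketch below illustrates one minimal way this might be done; the function names, sample rate and pulse parameters are our own illustrative assumptions, not the Hapticon Editor's actual API.

```python
import math

def basis_wave(freq_hz, amp, duration_s, rate_hz=1000):
    """Sample a sine force waveform: one candidate basis function."""
    n = int(duration_s * rate_hz)
    return [amp * math.sin(2 * math.pi * freq_hz * i / rate_hz) for i in range(n)]

def silence(duration_s, rate_hz=1000):
    """Zero-force gap, used to give the icon a rhythm."""
    return [0.0] * int(duration_s * rate_hz)

def make_hapticon(pattern, rate_hz=1000):
    """Concatenate (freq, amp, duration, gap) pulses into a single
    force trajectory, suitable for playback on a 1-DOF haptic device."""
    traj = []
    for freq, amp, dur, gap in pattern:
        traj += basis_wave(freq, amp, dur, rate_hz)
        traj += silence(gap, rate_hz)
    return traj

# A short-short-long rhythm at 40 Hz; like an Earcon, its meaning
# would be assigned by convention and learned by the user.
icon = make_hapticon([(40, 0.5, 0.1, 0.05),
                      (40, 0.5, 0.1, 0.05),
                      (40, 0.8, 0.3, 0.0)])
```

As with Earcons, nothing about such a waveform is intrinsically meaningful; distinguishability and learnability are empirical questions of the kind Brewster's studies address.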
Brewster’s studies have focused on understanding and quantifying how many different “Earcons” can be perceptually differentiated by users, which sounds are most perceptually salient, and whether certain sounds are appropriate for a given application. For example, a quiet sound might literally represent a very urgent or dangerous event simply because that event does not generate much sound in the real world. However, in a different