ComTouch: A Vibrotactile Emotional Communication Device
Angela Chang
Professor Sile O'Modhrain
MAS962: A Dialogue of the Senses, Final Project Term Paper

Keywords: human interface, human-computer interaction, tangible user interfaces, tactile communication, fingerspelling, deaf-blind, mobility aids, blindness, paging, telecommunications, multimodal sensory perception

BACKGROUND AND STRUCTURE OF THE PAPER

This paper describes the research to date in the ComTouch project, a multimodal sensory communication system that provides an additional channel for emotional communication, and describes a method for combining tactile and audio information. The aim is to build a device that can send and receive tactile sensations remotely.

The paper begins with a brief review of research on existing tactile languages and the features of such languages that are desirable in a tactile remote communication device. Next, the set of available tactile stimuli is described, along with the combination of tactile stimuli with other sensory modalities. Fundamental design issues, namely what information to encode and how to present it, are then discussed. Scenarios of use are described to aid in visualizing possible usage of the device, and issues of language and device implementation are addressed. Finally, we propose experiments to evaluate the communicative performance of the device.

TACTILE LANGUAGES

The term tactile refers to the sense of touch. Tactile perception is the ability to interpret and give meaning to sensory stimuli arising from tactile stimulation. Tactile language refers to communication methods that employ the sense of touch.
Tactile languages can be subdivided into two classes: alphabetic and symbolic. Alphabetic languages represent alphanumeric letters to form words; examples include chording keyboards, Braille, Moon, and telegraphy. Symbolic languages, by contrast, represent higher-level concepts that map not directly to words but to ideas and expressive emotions; examples include facial expressions, hand gestures, and body language for conveying interest and emotional state. Research has shown that the transmission rate of alphanumeric letters is much slower than that of symbolic language, while accuracy is much higher for alphabetic languages than for symbolic ones (Reed, 1990).

Symbolic and alphabetic methods can also be combined within a single language; Morse code is one such combination. In one study of Morse code, users began by learning individual letters. As usage time increased, a symbolic language emerged, and it became hard to distinguish individual letters in transmission. Experienced users were able to recognize whole sentences using shorthand and to perform simultaneous translation of speech while decoding Morse messages (Tan, 1997). Fingerspelling, in which the pressure and movement of one hand is received on another hand, is another tactile language with capabilities for both symbolic and alphabetic communication.

COUPLING OF TOUCH WITH SENSORY MODALITIES

While the sense of touch alone can be an effective communication channel (Geldard, 1967), touch used in combination with other senses reinforces perception and communication. The following is a brief overview of research on tactile stimulation in combination with other sensory modalities.

Touch and Audition

Audition is typically a broadcast medium. By coupling the audio channel with the private sense of touch, information can draw more attention.
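The Morse-code study described above rests on a simple pulse encoding that translates directly to vibrotactile output. As a minimal sketch (not from the ComTouch paper), the standard Morse timing conventions can be rendered as vibration on/off durations: a dot vibrates for 1 unit, a dash for 3, with a 1-unit pause between symbols and a 3-unit pause between letters. The function name and abbreviated letter table below are illustrative.

```python
# Sketch: rendering Morse code as vibrotactile pulses using the standard
# timing ratios (dot = 1 unit on, dash = 3 units on, symbol gap = 1 unit
# off, letter gap = 3 units off). The Morse table is abbreviated.

MORSE = {
    'E': '.', 'T': '-', 'S': '...', 'O': '---',
    'A': '.-', 'N': '-.', 'I': '..', 'M': '--',
}

DOT, DASH, SYMBOL_GAP, LETTER_GAP = 1, 3, 1, 3

def to_pulses(text):
    """Convert text into a list of (vibrate_units, pause_units) pairs."""
    pulses = []
    for ch in text.upper():
        code = MORSE[ch]
        for j, sym in enumerate(code):
            on = DOT if sym == '.' else DASH
            off = LETTER_GAP if j == len(code) - 1 else SYMBOL_GAP
            pulses.append((on, off))
    return pulses

print(to_pulses("SOS"))
```

A receiver that drives a vibrating motor from these pairs reproduces the rhythm that, in the study above, users eventually perceived as whole words rather than letter sequences.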
Research on vibrotactile devices such as the Tactaid and the Tactuator (Tan, 1996) shows that speech recognition increases dramatically when audio and touch input are combined in speech reading (Reed, 1995; Tan, 1997). Tadoma is a method of speech reading through touch in which the receiver places a thumb on the speaker's lips, with fingers on the throat. Intonation information was available from the vibrotactile stimuli at levels of approximately 70% (Auer et al., 1999).

Touch and Smell

Experiments on smell and memory show that there is a connection between memory recall and smell (Aggleton, 1998). Herz has done extensive work exploring the coupling of touch and smell in emotion, and Ehrlichman (1998) has used smell to recreate moods. Scratch-and-sniff technology was a fad that is no longer popular. Recently, a company called DigiSense has introduced a device for coupling smell with web browsing.
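As a closing illustration of the design question raised earlier, namely what information to encode and how to present it, one plausible (purely hypothetical) scheme maps a sender's finger-pressure reading onto a small set of discrete vibration intensities for remote playback. The function name, pressure range, and choice of eight levels below are assumptions for illustration, not part of the ComTouch design.

```python
# Hypothetical sketch: quantize a finger-pressure reading into one of a
# small number of vibration intensity levels for remote playback.
# The name, the [0, p_max] range, and the 8 levels are all assumptions.

def pressure_to_vibration(pressure, p_max=1.0, levels=8):
    """Map a pressure in [0, p_max] to a vibration intensity in [0, 1],
    quantized to `levels` discrete steps."""
    p = max(0.0, min(pressure, p_max))        # clamp out-of-range readings
    step = round((p / p_max) * (levels - 1))  # nearest discrete level
    return step / (levels - 1)

print(pressure_to_vibration(0.0), pressure_to_vibration(1.0))
```

Quantizing to a few levels trades expressive resolution for robustness: coarser steps are easier to distinguish under the limits of vibrotactile perception, which is one form the encoding trade-off discussed above can take.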