Towards a One-Way American Sign Language Translator

R. Martin McGuire, Thad Starner, Valerie Henderson, Helene Brashear
GVU Center, Georgia Tech, Atlanta, GA 30332
{haileris, thad, vlh, brashear}@cc.gatech.edu

Jose Hernandez-Rebollar
Engineering and Applied Science, George Washington University, Washington, DC 20052
jreboll@gwu.edu

Danielle S. Ross
Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14627
psycholing@earthlink.net

Abstract

Inspired by the Defense Advanced Research Projects Agency's (DARPA) recent successes in speech recognition, we introduce a new task for sign language recognition research: a mobile one-way American Sign Language translator. We argue that such a device should be feasible in the next few years, may provide immediate practical benefits for the Deaf community, and can lead to a sustainable program of research comparable to early speech recognition efforts. We ground our efforts in a particular scenario, that of a Deaf individual seeking an apartment, and discuss the system requirements and our interface for this scenario. Finally, we describe initial recognition results of 94% accuracy on a 141-sign vocabulary, signed in phrases of four signs, using a one-handed glove-based system and hidden Markov models (HMMs).

1. Introduction

Twenty-eight million Deaf and hard-of-hearing individuals form the largest disabled group in the United States. Everyday communication with the hearing population poses a major challenge to those with hearing loss. Most hearing people do not know sign language and know very little about deafness in general. For example, most hearing people do not know how to communicate in spoken language with a Deaf or hard-of-hearing person who can speak and read lips (e.g., that they should not turn their heads or cover their mouths). Although many Deaf people lead successful and productive lives, overall, this communication barrier can have detrimental effects on many aspects of their lives.
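The abstract reports recognition with hidden Markov models over glove data. A common formulation of that task, sketched minimally below, trains one HMM per sign and classifies an observation sequence by maximum forward-algorithm likelihood. The two-state models, sign names, and discrete two-symbol alphabet here are hypothetical toy values for illustration only; the actual system uses continuous glove features, and `forward_log_likelihood` and `classify` are not names from this paper.

```python
import numpy as np

def forward_log_likelihood(obs, log_pi, log_A, log_B):
    """Log P(obs | model) via the HMM forward algorithm, in log space.

    obs: list of discrete observation symbol indices.
    log_pi: (N,) log initial state probabilities.
    log_A:  (N, N) log transition probabilities, log_A[i, j] = log P(j | i).
    log_B:  (N, M) log emission probabilities, log_B[i, k] = log P(symbol k | state i).
    """
    alpha = log_pi + log_B[:, obs[0]]
    for symbol in obs[1:]:
        # Sum over previous states i, then emit the current symbol.
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, symbol]
    return np.logaddexp.reduce(alpha)

def classify(obs, models):
    """Pick the sign whose HMM assigns the sequence the highest likelihood."""
    return max(models, key=lambda sign: forward_log_likelihood(obs, *models[sign]))

# Toy two-state models for two hypothetical signs over a two-symbol alphabet.
log_pi = np.log([0.9, 0.1])
log_A = np.log([[0.6, 0.4], [0.4, 0.6]])
MODELS = {
    "HELLO": (log_pi, log_A, np.log([[0.9, 0.1], [0.8, 0.2]])),
    "HOUSE": (log_pi, log_A, np.log([[0.1, 0.9], [0.2, 0.8]])),
}

print(classify([0, 0, 0, 0], MODELS))  # a run of symbol 0 best matches "HELLO"
```

In practice the per-sign models would be trained with Baum-Welch on recorded glove feature sequences, and continuous (e.g., Gaussian) emission densities would replace the discrete table used here.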
Not only can person-to-person communication barriers impede everyday life (e.g., at the bank, post office, or grocery store), but essential information about health, employment, and legal matters is often inaccessible.

Common current options for alternative communication modes include cochlear implants, writing, and interpreters. Cochlear implants are not a viable option for all Deaf people. In fact, only 5.3% of the deaf population in America has a cochlear implant, and of those, 10.1% no longer use their implant (complaints cited are similar to those about hearing aids) [2]. The ambiguity and slowness of handwriting make it a frustrating mode of communication: conversational rates (both spoken and signed) range from 175 to 225 words per minute (WPM), while handwriting rates range from 15 to 25 WPM [5]. In addition, English is often a Deaf person's second language, American Sign Language (ASL) being their first. Although many Deaf people achieve a high level of proficiency in English, not all Deaf people can communicate well through written language. Since the average Deaf adult reads at approximately a fourth-grade level [1, 9], communication through written English can be too slow and often is not preferred.

Interpreters are commonly used within the Deaf community, but interpreters can charge high hourly rates and be awkward in situations where privacy is a high concern, such as at a doctor's or lawyer's office. Interpreters for Deaf people with specialized vocabularies, such as that of a PhD in Mechanical Engineering, can be difficult to find and very expensive. It can also be difficult to find an interpreter in unforeseen emergencies where timely communication is extremely important, such as car accidents.

2. The One-Way Translator

Our goal is to offer a sign recognition system as another means of augmenting communication between the Deaf and hearing communities.
We seek to implement a mobile, self-contained system that a Deaf user could use as a limited interpreter. This wearable system would capture and recognize the Deaf user's signing. The user could then cue the