A 2D Vibration Array as an Assistive Device for the Visually Impaired

D. Dakopoulos¹, S. K. Boddhu², N. Bourbakis¹
College of Engineering and Computer Science
¹Assistive Technologies Research Center, ²Computational Autonomy Research Lab
Wright State University, Dayton, OH, USA
dakopoulos.2@wright.edu, nikolaos.bourbakis@wright.edu

Abstract—This paper deals with the design, simulation and implementation of a 2D vibration array used as a major component of an assistive wearable navigation device for the visually impaired. The 2D vibration array consists of 16 (4x4) miniature vibrators connected to a portable computer, which is the main computing component of the entire wearable navigation system, called Tyflos. Tyflos consists of two miniature cameras (attached to a pair of dark glasses), a microphone, an ear speaker, the 2D vibration array, and a portable computer. The cameras capture images from the surrounding environment, and after appropriate processing 3D representations are created. These 3D space representations are projected onto the 2D array, which vibrates at various levels corresponding to the distances of the surrounding obstacles. The 2D array is attached to the user's chest in order to provide the appropriate sensation (via vibrations) of the distances from the surroundings.

Keywords—blind navigation; visually impaired; vibrotactile; formal language; wearable system

I. INTRODUCTION

According to the N.F.B. (National Federation of the Blind), the estimated number of blind and visually impaired people of all ages in the U.S. (including institutionalized and homeless people) is 5-6 million; thus the need for assistive devices is, and will remain, important. Navigation devices are one class of assistive systems developed by researchers and practitioners [1-16, 21]. There is a wide range of navigation systems and tools available for visually impaired individuals; the white cane and guide dogs are the most popular.
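The depth-to-vibration projection described in the abstract can be sketched as follows. This is a minimal illustration, not the Tyflos implementation: the block-minimum reduction, the level count, and the distance range limits are all assumptions chosen for the sketch.

```python
import numpy as np

def depth_to_vibration(depth_map, levels=4, d_min=0.5, d_max=5.0):
    """Project a dense depth map onto a 4x4 grid of vibration levels.

    Returns a (4, 4) integer array: 0 = no vibration (far or out of
    range), levels-1 = strongest vibration (very close obstacle).
    Parameter values are illustrative, not the prototype's.
    """
    h, w = depth_map.shape
    grid = np.empty((4, 4))
    for i in range(4):
        for j in range(4):
            # The nearest obstacle in each image block dominates its cell,
            # so a close object is never averaged away.
            block = depth_map[i * h // 4:(i + 1) * h // 4,
                              j * w // 4:(j + 1) * w // 4]
            grid[i, j] = block.min()
    # Linear inverse mapping: nearer obstacles vibrate harder.
    d = np.clip(grid, d_min, d_max)
    frac = (d_max - d) / (d_max - d_min)
    return np.round(frac * (levels - 1)).astype(int)
```

Taking the block minimum rather than the mean is one plausible design choice for a safety device, since the closest obstacle in each region is the one the user most needs to sense.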
Since the 1960s, evolving technology has helped many researchers build electronic navigation devices. There are three categories of navigation systems [21]: i) vision enhancement, ii) vision replacement and iii) vision substitution. Vision enhancement involves input from a camera, processing of the information, and output on a visual display. In its simplest form it may be a miniature head-mounted camera with the output on a head-mounted visual display (as used in some virtual reality systems). Vision replacement involves displaying the information directly to the visual cortex of the human brain or via the optic nerve. Vision substitution is similar to vision enhancement, but with the output being non-visual, typically tactual or auditory or some combination of the two. ETAs (Electronic Travel Aids) belong to the vision substitution systems: they transform information about the environment that would normally be relayed through vision into a form that can be conveyed through another sensory modality. Our navigation system (Tyflos) belongs to the category of vision substitution, and one of its important modules is a 2D vibration array, which offers the blind user a sensation of the 3D surrounding space. Thus, here we present the 2D vibration array device and its advantages and disadvantages regarding the information it provides to the user.

The paper is organized into seven sections. Section 2 offers a brief description of the Tyflos system. Section 3 presents the modeling of the 2D vibration array based on a formal language. Section 4 describes the hardware implementation of the 2D array. Section 5 discusses the high-to-low resolution representation on the 2D array and information representation issues. Section 6 shows some experimental results, and Section 7 concludes the overall presentation with future work.

II. THE TYFLOS NAVIGATION SYSTEM

A. The First Prototype

The main role of the Tyflos mobility assistant is to capture environmental data from various sensors and map the extracted and processed content onto the available user interfaces in the most appropriate manner. The Tyflos prototype will integrate a wireless handheld computer, cameras, range sensors, GPS sensors, microphones, a natural language processor, a text-to-speech device, and a digital audio recorder. The audio-visual input devices and the audio output devices can be worn (or carried) by the user. Data collected by the sensors are processed by the Tyflos modules, each specialized in one or more tasks. In particular, the system interfaces with external sensors (such as GPS, if applicable, range sensors, etc.) as well as with the user, facilitating focused and personalized content delivery. The user communicates the task of interest to the mobility assistant using a speech-recognition interface. The preliminary design and development of the Tyflos prototype has already been carried out by the authors (Fig. 1). This prototype consists of two cameras, a range scanner, an ear speaker, a microphone, a speech synthesizer, and a portable computer. This device has been evaluated by students with visual disabilities and their feedback has been used in the

1-4244-1509-8/07/$25.00 ©2007 IEEE
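The modular, task-specialized processing described above can be sketched as a simple dispatch structure: each module registers for one or more tasks, and a recognized spoken command is routed to the matching module. All class, method, and task names here are hypothetical illustrations, not the prototype's actual interfaces.

```python
class MobilityAssistant:
    """Illustrative sketch of task-specialized module dispatch (names assumed)."""

    def __init__(self):
        self.modules = {}  # task name -> handler callable

    def register(self, task, handler):
        """Attach a specialized module (handler) to a task name."""
        self.modules[task] = handler

    def on_speech_command(self, task, *args):
        """Route a recognized spoken command to the module for that task."""
        handler = self.modules.get(task)
        if handler is None:
            return "unknown task"  # would be reported back via the ear speaker
        return handler(*args)

# Hypothetical usage: a module that reports an obstacle reading for one cell.
assistant = MobilityAssistant()
assistant.register("read_distance", lambda cell: f"cell {cell} obstacle reading")
```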