Proceedings of the 2003 International Conference on Auditory Display, Boston, MA, USA, 6-9 July 2003

A CASE STUDY IN THE DESIGN OF SOFTWARE THAT USES AUDITORY CUES TO HELP LOW VISION STUDENTS VIEW NOTES ON A BLACKBOARD

Dave Berque, Terri Bonebright, Seth Kinnett, Nathan Nichols, Adam Peters
DePauw University
Greencastle, IN 46135
{deberque, tbone, skinnet, nnichols, apeters}@depauw.edu

This ongoing project investigates the interplay between educational technology, pen-based computing, and auditory displays in the design of assistive technology for low-vision students in a classroom setting. Specifically, we report on the design, implementation, and evaluation of a software system named v-VIS (Viewer for Visually Impaired Students) that addresses the difficulty low-vision students have in seeing material written on a blackboard or overhead projector in a traditional classroom. Instead of writing extemporaneously on a blackboard or overhead projector, the instructor in a v-VIS enabled classroom uses an electronic stylus to write and sketch material freehand on the surface of an electronic video tablet. Material written on the video tablet is input to a computer projection system that displays it on a screen at the front of the room, allowing fully sighted students to view the material much as they would if the instructor were writing on an overhead projector. The instructor's writing is simultaneously transmitted to a computer at the low-vision student's desk, where it is displayed using color adjustment, zoom, and audio cues. Several distinct auditory cues inform the low-vision student when new material begins to arrive on the screen, how long it continues to arrive, and which region of the screen it is being displayed in. Work on the v-VIS system has been informed both by a formal user study comparing several audio cue designs and by feedback from a low-vision student who used the system in a semester-long statistical methods course.
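The abstract does not give implementation details, but a minimal sketch can illustrate the kind of cue mapping it describes: an onset cue when new material starts arriving, a sustained cue while it continues, and a pitch that encodes the screen region being written in. All names below (InkEvent, play_tone, the 3x3 grid) are hypothetical, not taken from v-VIS.

```python
# Illustrative sketch only: maps incoming pen-stroke events to auditory
# cues of the three kinds the abstract describes. Not the v-VIS code.
from dataclasses import dataclass

@dataclass
class InkEvent:
    x: float          # normalized 0..1 across the display
    y: float          # normalized 0..1 down the display
    timestamp: float  # seconds

GRID = 3  # divide the screen into a 3x3 grid of regions (an assumption)

def region_of(ev: InkEvent) -> int:
    """Map a normalized pen position to one of GRID*GRID screen regions."""
    col = min(int(ev.x * GRID), GRID - 1)
    row = min(int(ev.y * GRID), GRID - 1)
    return row * GRID + col

def region_pitch(region: int) -> float:
    """Assign each region a distinct pitch, one whole tone apart."""
    return 220.0 * 2 ** (region * 2 / 12)

def play_tone(freq: float, duration: float) -> None:
    # Placeholder for a real audio backend; prints so the sketch runs anywhere.
    print(f"tone {freq:6.1f} Hz for {duration:.2f} s")

def sonify(events: list[InkEvent], gap: float = 1.0) -> None:
    """Emit an onset cue when writing resumes after a pause, then a
    region-pitched cue for as long as ink keeps arriving."""
    last_time = float("-inf")
    for ev in events:
        if ev.timestamp - last_time > gap:
            play_tone(880.0, 0.1)  # onset cue: new material is starting
        play_tone(region_pitch(region_of(ev)), 0.05)
        last_time = ev.timestamp

# Example: two strokes in the upper-left region, then, after a pause,
# a stroke in the lower-right region (triggering a second onset cue).
sonify([InkEvent(0.1, 0.1, 0.0), InkEvent(0.15, 0.12, 0.2),
        InkEvent(0.8, 0.9, 3.0)])
```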
SONIFICATION OF DYNAMIC CHOROPLETH MAPS: GEO-REFERENCED DATA EXPLORATION FOR THE VISION-IMPAIRED

Haixia Zhao, Catherine Plaisant, Ben Shneiderman
Department of Computer Science and Human-Computer Interaction Laboratory
University of Maryland
College Park, MD 20742
{haixia, plaisant, ben}@cs.umd.edu

Dmitry N. Zotkin, Ramani Duraiswami
Perceptual Interfaces and Reality Laboratory, UMIACS
University of Maryland
College Park, MD 20742
{dz, ramani}@umiacs.umd.edu

Interactive data visualization tools help users gain insight about data and find patterns and exceptions, but they are usually inaccessible to vision-impaired users. For geo-referenced data, where users need to combine demographic, economic, or other data in a geographic context for decision-making, we designed YMap, a dynamic choropleth map tool that visualizes data attributes on the map and supports slider-based dynamic queries. User studies show that YMap can help users find specific geographic regions that match a query and retrieve details, find trends and patterns, and detect correlations between attributes. As a first step toward a multimodal (audio+haptic) exploration tool for the vision-impaired, we created a virtual spatial sound display for the interactive map by synthesizing 3-D sounds of various timbres and pitches using head-related transfer functions (HRTFs) and tying these sounds to map regions and interface widgets.

The 3-D sounds create the effect of a virtual map hung on the surface of a large virtual sphere with the user sitting at its center. Three audio interactions have been implemented: (1) gliding the cursor over the map to hear the sound of individual regions; (2) adjusting dynamic query sliders and hearing the sounds of regions being filtered out or filtered in; and (3) using sweeping lines to scan the map and hear its sound patterns. The interface can be operated with either a keyboard or a tablet. Our research goals are to identify effective sonification mechanisms, especially as applied to dynamic choropleth maps, to explore coupling tactile perception with sound for maps, and to examine the tool's effectiveness in helping vision-impaired users explore large geo-referenced data sets. We also want to investigate the sonification of maps for sighted users, for use over the telephone or as a complement to visual modes.
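As a rough illustration of the virtual-sphere model and the sweep-line interaction the abstract describes, the sketch below spreads map-region centroids over a spherical patch in front of the listener (x to azimuth, y to elevation), maps attribute values to pitch, and scans the map column by column. The HRTF rendering itself is omitted, and every name here (Region, sphere_direction, sweep) is an assumption for illustration, not YMap's API.

```python
# Illustrative sketch only: geometry and pitch mapping for a spatialized
# choropleth map, with a sweep-line scan. Not the YMap implementation.
import math
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    cx: float     # centroid, normalized 0..1 across the map
    cy: float     # centroid, normalized 0..1 down the map
    value: float  # attribute value, normalized 0..1

def sphere_direction(r: Region,
                     az_span=math.radians(120),
                     el_span=math.radians(60)):
    """Spread the flat map over a patch of the sphere in front of the
    listener: x -> azimuth (left/right), y -> elevation (up/down)."""
    azimuth = (r.cx - 0.5) * az_span
    elevation = (0.5 - r.cy) * el_span
    return azimuth, elevation

def value_pitch(r: Region, lo=220.0, hi=880.0) -> float:
    """Map higher attribute values to higher pitch on a log scale."""
    return lo * (hi / lo) ** r.value

def sweep(regions, columns=10):
    """Sweep a vertical line across the map, yielding the regions each
    column crosses so their sounds can be played column by column."""
    for c in range(columns):
        x0, x1 = c / columns, (c + 1) / columns
        hit = [r for r in regions if x0 <= r.cx < x1]
        yield sorted(hit, key=lambda r: r.cy)

# Example: two regions, printed instead of rendered through an HRTF.
for col in sweep([Region("A", 0.2, 0.3, 0.9), Region("B", 0.7, 0.6, 0.2)]):
    for r in col:
        az, el = sphere_direction(r)
        print(f"{r.name}: az={math.degrees(az):+.0f} deg, "
              f"el={math.degrees(el):+.0f} deg, pitch={value_pitch(r):.0f} Hz")
```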