Improving Web Accessibility of Graphs for the Visually Impaired

Dr. Arvinder Kaur 1, Diksha Dani 2 and Neeta Mishra 2
1 Guru Gobind Singh Indraprastha University, New Delhi, India
2 Inderprastha Engineering College, Ghaziabad, India

Abstract: Graphical information is conveyed to visually impaired people through haptic and audio channels. This paper addresses the problems faced by visually impaired people in accessing graphical information on the Internet, particularly the common types of graphs. The proposed model takes its input from web pages automatically and generates output by integrating synthesized speech with non-speech sounds to help blind people comprehend graphs.

Keywords: accessibility, blind, web, non-speech sounds

I. INTRODUCTION

With the exponential growth of the Internet and Internet-related services, technological conveniences and advances have made their way into our daily lives and have become a major part of them. The implications of the web boom are felt by all people, the visually impaired being no exception. The idea of having virtually all the information you want at your fingertips is appealing to them too. Traditionally, blind people have depended on written information translated into braille or audio books, which often take time to produce. Through the Internet, new information is available immediately, without delay. Assistive technology enables these users to access websites. Two categories of assistive technology are used most by blind Internet users: screen readers and refreshable braille displays. Screen readers are software that translates screen contents into synthetic speech. Refreshable braille displays are hardware devices containing a strip of retractable braille pins, allowing braille characters to be generated on the fly.
Many assistive technologies and multimodal interfaces exist, including JAWS from Freedom Scientific [27], Window-Eyes from GW Micro [28], and LookOUT from Choice Technology; in addition, a basic screen reader called Narrator [14] is included in the Windows 2000 and XP operating systems. A screen reader application attempts to describe to the blind user in speech what the graphical user interface is displaying. Self-voicing applications are written specifically for blind people and provide their output through synthesized or recorded speech. One of the main problems blind users face on the Internet is accessing the information contained in graphs. Blind and visually impaired people are often deprived of access to information because of the use of visualisations such as graphs. Data visualizations such as line graphs, bar charts and pie charts are instead presented to these users through haptic and audio channels. This paper attempts to address this issue.

II. RELATED WORK

Excellent research into non-visual interfaces to graphs and charts has been carried out in the past decade to help solve the problem of data visualization for blind users. Non-visual interfaces to graphs generally fall into four categories:

* Sonification [5-9]
* Haptic interfaces [10, 11, 12]
* Hybrid systems [13, 15]
* NL interfaces [3, 16, 17]

These systems use sound (linguistic and non-linguistic), touch, or both together to communicate graphical information to the user. The SoundGraphs [1] system presents line graphs in sound: time is mapped to the X-axis and pitch to the Y-axis, so the shape of the graph can be heard as a rising or falling note playing over time. This allows listeners to get an overview by listening to all of the data very quickly. Haptic devices, in turn, allow users to feel virtual objects.
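The SoundGraphs-style mapping described above (time to the X-axis, pitch to the Y-axis) can be sketched in a few lines of Java. All names here (SoundGraphSketch, toMidiPitch) and the chosen pitch range are illustrative assumptions, not the published system:

```java
// Sketch of a SoundGraphs-style sonification mapping: each Y value is scaled
// linearly onto a MIDI pitch range, so higher values sound as higher notes.
// Class and method names are illustrative, not taken from the SoundGraphs system.
public class SoundGraphSketch {

    // MIDI note 48 (C3) to 84 (C6): a three-octave span is an assumption
    // chosen to keep the tones comfortably audible.
    static final int LOW_NOTE = 48;
    static final int HIGH_NOTE = 84;

    // Map a data value in [min, max] linearly onto [LOW_NOTE, HIGH_NOTE].
    public static int toMidiPitch(double value, double min, double max) {
        if (max <= min) return LOW_NOTE;          // degenerate range: flat tone
        double t = (value - min) / (max - min);   // normalise to [0, 1]
        return (int) Math.round(LOW_NOTE + t * (HIGH_NOTE - LOW_NOTE));
    }

    public static void main(String[] args) {
        double[] series = {10, 25, 40, 30, 55};   // example Y values
        for (double v : series) {
            // In a full implementation these pitches would be played over time
            // (e.g. via javax.sound.midi), one note per data point.
            System.out.println(v + " -> MIDI note " + toMidiPitch(v, 10, 55));
        }
    }
}
```

Playing the resulting notes left to right reproduces the rising-or-falling contour of the line graph in sound.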
The haptic device called PHANToM from SensAble Technologies [7] is used to instantiate virtual objects. It is a very high resolution, six-degrees-of-freedom device consisting of a motor-controlled jointed arm. Users operate the device by placing a finger in a thimble at its tip, which affords a very natural interaction with the objects [22]. Another development in this area, the iGraph-Lite system, provides short verbal descriptions of the information depicted in graphs and a way to interact with this information [3]. An evaluation of the iGraph-Lite system is reported by Ferres et al. [4].

III. PROPOSED WORK

This work extends our previous work, in which we proposed an architecture of a web surfing model for blind people [23]. In this paper we have implemented a part of that model using the Java language. Research has shown that people are able to interpret line graphs that are sonified by representing each data point with a musical note [18-20]. In this paper, the XY graph is made comprehensible to blind people.

Arvinder Kaur et al, / (IJCSIT) International Journal of Computer Science and Information Technologies, Vol. 2 (5), 2011, 1979-1981
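The speech side of such a model, an iGraph-Lite-style short verbal summary of a data series that a speech synthesizer could read aloud, can be sketched as follows. The class name, the summary wording, and the choice of statistics are all hypothetical illustrations, not the actual iGraph-Lite output:

```java
// Hypothetical sketch of a short verbal graph description of the kind a
// screen reader or speech synthesizer could speak. The phrasing and the
// statistics chosen here are assumptions for illustration only.
public class GraphDescriber {

    public static String describe(String title, double[] y) {
        double min = y[0], max = y[0];
        for (double v : y) {              // scan once for the range
            min = Math.min(min, v);
            max = Math.max(max, v);
        }
        // Compare last point to first point for an overall trend word.
        String trend = y[y.length - 1] > y[0] ? "rises overall"
                     : y[y.length - 1] < y[0] ? "falls overall"
                     : "stays level";
        return title + ": " + y.length + " points, minimum " + min
             + ", maximum " + max + "; the line " + trend + ".";
    }

    public static void main(String[] args) {
        // Example series; the title is a made-up placeholder.
        System.out.println(describe("Monthly sales", new double[]{10, 25, 40, 30, 55}));
    }
}
```

A description like this, spoken before the note-by-note sonification, would give listeners an overview first and the detailed contour second.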