AES 16th International Conference on Spatial Sound Reproduction

LISTENING TO ROOMS AND OBJECTS

RICHARD MCGRATH 1, THOMAS WALDMANN 2, MIKAEL FERNSTRÖM 3

1 Department of Manufacturing and Operations Engineering, University of Limerick, Ireland, richard.mcgrath@ul.ie
2 Department of Manufacturing and Operations Engineering, University of Limerick, Ireland, thomas.waldmann@ul.ie
3 Interaction Design Centre, Department of Computer Science and Information Systems, University of Limerick, Ireland, mikael.fernstrom@ul.ie

This paper describes a series of experiments in which blind and sighted people took part in a "thinking-aloud" study. The aim was to assess participants' ability to describe the properties of, and their own location in, two different rooms, and the properties and locations of three different objects, based on auditory cues generated by the participants themselves.

INTRODUCTION

To develop models for virtual sound that can enhance user interaction in virtual reality applications, we need to expand our understanding of what we hear and how we hear [15,16]. In virtual immersive environments, the addition of virtual interaction sounds can assist users in understanding these environments, especially when judging the size and location of objects in a virtual space. To investigate the basis for this on the human side, we designed an experiment in which both blind and blindfolded participants performed localisation tasks.

Five blind and seven sighted participants took part in this study. All participants were adults and reported normal hearing ability. Three of the blind participants had been blind for more than twenty years, one for twelve years, and one for three years; only the last had some light perception. Sighted participants were blindfolded throughout the experiments.

There were two kinds of task in the experiments. Task 1 was to give a description of one small room and of a large concert hall.
For this task, participants were asked to use the feedback from their own voices to describe the rooms, and to answer twelve questions about each room. Performance was videotaped, and the recordings were analysed with regard to eight variables, such as height, walls, etc. Task 2 was to describe three objects in each of the rooms: a sheet of aluminium, a sheet of aeroboard, and a leather football. Each object was placed in various locations to the left, right, or centre of the participant, and at three arbitrary locations at various distances from the participant.

1. BACKGROUND

Blind people must depend on non-visual senses for information to help them locate and identify objects and persons. Human auditory spatial acuity is quite poor compared to foveal acuity, so it is not surprising that the visual system is generally treated as the optimal sensory channel for the acquisition of spatial information. However, although vision is the dominant sensory mode in sighted humans, audition is important, and interaction between the two modalities is common.

The ability to localise sound sources is of considerable importance to humans: it enables us to determine the location of objects and indicates the appropriate orientation to face in order to see them. The term localisation is used to describe judgements of both direction and distance.

Visually impaired people need a well-developed sense of space in order to navigate safely around their environment. It can therefore be assumed that visually impaired people may be more efficient at investigating their environment on the basis of sound alone than sighted people, for whom sound localisation is less important. If we can ascertain,