O’Sullivan & Boland — Visualizing & Controlling Sound with Graphical Interfaces
AES 41st International Conference, London, UK, 2011 February 2–4

VISUALIZING AND CONTROLLING SOUND WITH GRAPHICAL INTERFACES

LIAM O’SULLIVAN, FRANK BOLAND
Dept. of Electronic & Electrical Engineering, Trinity College Dublin, Dublin 2, Ireland
lmosulli@tcd.ie

Developments in abstract representations of sound from the field of computer music have potential applications for designers of musical computer games. Research in cognition has identified correlations in the perception of visual objects and audio events; experiments show that test subjects associate certain qualities of graphical shapes with those of vocal sounds. Such 'sound symbolism' has been extended to non-vocal sounds, and this paper describes attempts to exploit this and other phenomena in the visualization of audio. The ideas are expanded upon to propose the control of sound synthesis through the manipulation of virtual shapes. Mappings between parameters in the auditory and visual feedback modes are discussed. An exploratory user test examines the technique using a prototype system.

INTRODUCTION

The popularity of certain music-based computer games highlights one approach to the visual representation of music and sound in virtual environments. Games like Guitar Hero¹ provide an engaging experience through a note-entry-type task demanding high temporal precision. However, no commercially available games exploit the ability of the modern computer to manipulate and control musical timbre in real time. This paper outlines an approach to the representation and control of timbre through the provision of an effective graphical user interface (GUI).

The remainder of the paper is organized as follows. Section 1 describes examples of approaches to the sound synthesis GUI. Section 2 discusses aspects of perceived relationships between visual and auditory stimuli, including sound symbolism. Section 3 presents examples of software applications using such perceptual links. Section 4 describes a prototype interface used in an exploratory study into particular sound–shape relationships and outlines the subjective test undertaken. Section 5 discusses the results of the experiment. Section 6 proposes future work, and a conclusion is offered in Section 7.

¹ http://hub.guitarhero.com/

1 SOUND CONTROL INTERFACES

In the fields of computer music and audio production, GUIs take a number of common approaches [9]. Some emulate hardware devices; for example, a virtual synthesizer may use knobs, sliders and similar interface widgets placed on a graphical background (Fig. 1). These assume that the user has specific operational knowledge of the original device (e.g. the effect of modifying a particular synthesis parameter) or is familiar with a learned convention (such as the pitch distribution of the piano keyboard). A number of more experimental GUI designs employ interactive widgets as a means to control sound (Fig. 2); these commonly represent the sound control parameters in some way, so the user is aware of the underlying system state.

While many synthesis parameters are available in the above examples, the interfaces take the legacy windows, icons, menus, pointer (WIMP) format. This arrangement is not particularly suited to real-time musical play, as it fosters an analytical approach to sound control, effectively decomposing the output sound into separate parameters [6]. The common one-to-one input–output mapping is not the most engaging for musical tasks. Although they are less intuitive for a beginner, complex mappings from more than one input to more than one output are more absorbing [2]. This has obvious ramifications for the design of musical games, where fun and engagement are of vital importance. The representation and control of sound synthesis using virtual objects allows simultaneous modification of multiple parameters [10].
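One common way to realize such a many-to-many mapping is a linear transformation from input gestures to synthesis parameters. The following is a minimal sketch under assumed values: the mixing matrix, parameter names and input ranges are illustrative and are not taken from any system described here.

```python
import numpy as np

# Hypothetical many-to-many mapping: two gestural inputs drive three
# synthesis parameters simultaneously via a fixed mixing matrix.
# Matrix values are illustrative only.
MAPPING = np.array([
    [0.8, 0.2],   # cutoff responds mostly to input x, partly to y
    [0.3, 0.7],   # modulation index responds mostly to input y
    [0.5, 0.5],   # amplitude responds equally to both inputs
])

def map_inputs(x: float, y: float) -> dict:
    """Map normalised inputs (0..1) to normalised synthesis parameters."""
    cutoff, mod_index, amp = MAPPING @ np.array([x, y])
    return {"cutoff": cutoff, "mod_index": mod_index, "amp": amp}

# Moving a single input changes every output parameter at once,
# in contrast to a one-to-one slider-per-parameter layout.
print(map_inputs(1.0, 0.0))
```

Because each column of the matrix affects every row, a single gesture perturbs all parameters together, which is the property the mapping literature cited above associates with more engaging musical control.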
By representing parameters intuitively, the cognitive load associated with musical play may be reduced. One approach to this is the abstract representation of the output sound. Software that visualizes sound data in different ways has historically been of interest [8], and Levin provides a good overview of GUI approaches [9]. This research focuses on ways to intuitively link simple graphical objects with perceived sound qualities. Some efforts to find such innate relationships are described next.
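To make the idea of linking a graphical object to a perceived sound quality concrete, the sketch below maps a shape's "spikiness" (variation in vertex distance from the centroid) to a spectral-brightness control, echoing the rounded/spiky intuition of sound symbolism. The spikiness measure, cutoff range and parameter names are hypothetical examples, not the mapping used in the prototype described later.

```python
# Illustrative sound-symbolic mapping: rounder shapes -> darker timbre,
# spikier shapes -> brighter timbre. All formulas and ranges are
# assumptions for the sake of the example.
def spikiness(radii):
    """0.0 for a regular (round) shape; grows as vertices protrude."""
    mean = sum(radii) / len(radii)
    return max(radii) / mean - 1.0

def brightness_from_shape(radii, min_cutoff_hz=500.0, max_cutoff_hz=8000.0):
    """Map shape spikiness to a low-pass filter cutoff in hertz."""
    s = min(spikiness(radii), 1.0)  # clamp to keep the cutoff in range
    return min_cutoff_hz + s * (max_cutoff_hz - min_cutoff_hz)

round_shape = [1.0] * 8        # all vertices at equal radius (circle-like)
spiky_shape = [1.0, 2.0] * 4   # alternating radii (star-like)
print(brightness_from_shape(round_shape))  # low cutoff: darker timbre
print(brightness_from_shape(spiky_shape))  # higher cutoff: brighter timbre
```

In an interactive system, the user would drag the shape's vertices and the synthesis engine would re-evaluate such a mapping continuously, so the visual change and the timbral change are perceived as one event.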