Mapping Strategies and Sound Engine Design for an Augmented Hybrid Piano

Palle Dahlstedt
Dept. of Applied IT & Academy of Music and Drama, Univ. of Gothenburg, Sweden
Dept. of Comm. and Psych., Aalborg Univ., Denmark
palle.dahlstedt@gu.se, dahlstedt@hum.aau.dk

ABSTRACT
Based on a combination of novel mapping techniques and carefully designed sound engines, I present an augmented hybrid piano specifically designed for improvisation. The mapping technique, originally developed for other control interfaces but here adapted to the piano keyboard, is based on a dynamic vectorization of control parameters, allowing both wild sonic exploration and minute intimate expression. The original piano sound is used as the sole sound source, subjected to processing techniques such as virtual resonance strings, dynamic buffer shuffling, and acoustic and virtual feedback. Thanks to the speaker and microphone placement, the acoustic and processed sounds interact in both directions and blend into one new instrument. This also allows for unorthodox playing (knocking, plucking, shouting). Processing parameters are controlled from the keyboard playing alone, allowing intuitive control of complex processing by ear and integrating expressive musical playing with sonic exploration. The instrument is not random, but somewhat unpredictable. This feeds into the improvisation, defining a particular idiomatics of the instrument. Hence, the instrument itself is an essential part of the musical work. Performances include concerts in the UK, Japan, Singapore, Australia and Sweden, in solo and ensemble settings, performed by several pianists. Variations of this hybrid instrument for digital keyboards are also presented.
Author Keywords
augmented instrument, piano, keyboard, mapping, hybrid instrument, performance, improvisation

ACM Classification
H.5.5 [Information Interfaces and Presentation] Sound and Music Computing; H.5.2 [Information Interfaces and Presentation] User Interfaces — Auditory (non-speech) feedback

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
NIME'15, May 31-June 3, 2015, Louisiana State Univ., Baton Rouge, LA
Copyright remains with the author(s).

1. INTRODUCTION
During the last few years I have run a research project with the goal of designing electronic instruments for free improvisation that meet the following criteria: 1) They should be free of presets, but with an easily operated mechanism for real-time exploration of the space of possible sounds. 2) There should be a correlation between physical effort and sound production, and a change in gestural input should correspond to a change in sonic gestural output. 3) They should be as direct and free in the interaction as acoustic instruments are, and the user should be able to develop skill and musicianship over time.

This project has resulted in a family of synthesis-based instruments using various interfaces and playing styles, e.g. instruments using arrays of pressure sensors, percussion controllers (pitched and non-pitched), MIDI keyboards [1] and even the GuitarHero controller [3]. The main breakthrough in this project has been the introduction of an unconventional mapping approach, where the control parameters are dynamically mapped to initially randomized vectors in synthesis parameter space.
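The core of this mapping approach can be illustrated with a minimal sketch. The class below is a hypothetical reconstruction, not the author's implementation: each control input is tied to a randomized direction vector in synthesis-parameter space, and the resulting region of exploration can be shifted (re-centred on the current sound) and re-scaled on the fly. All names, dimensions and ranges are illustrative assumptions.

```python
import random


class VectorMapping:
    """Sketch of a dynamic vectorized mapping (illustrative only).

    Each of n_controls control inputs is assigned a random direction
    vector in an n_synth_params-dimensional synthesis parameter space.
    """

    def __init__(self, n_controls, n_synth_params, scale=1.0, seed=None):
        self.rng = random.Random(seed)
        self.n = n_synth_params
        self.scale = scale                      # size of the reachable region
        self.origin = [0.5] * n_synth_params    # centre of exploration
        self.vectors = [self._random_vector() for _ in range(n_controls)]

    def _random_vector(self):
        return [self.rng.uniform(-1.0, 1.0) for _ in range(self.n)]

    def map(self, controls):
        """Map control values (e.g. key velocity, register) to synthesis
        parameters, clamped to [0, 1]."""
        p = list(self.origin)
        for c, v in zip(controls, self.vectors):
            for i in range(self.n):
                p[i] += self.scale * c * v[i]
        return [min(1.0, max(0.0, x)) for x in p]

    def shift(self, controls):
        """Re-centre the origin at the current point, so the present
        sound becomes the new rest position ('shifting on the fly')."""
        self.origin = self.map(controls)

    def rescale(self, factor):
        """Shrink or grow the reachable region around the origin: a small
        scale gives minute expression, a large scale wild exploration."""
        self.scale *= factor
```

With zero control input the mapping returns the current origin; shrinking the scale after a shift narrows exploration around a sound found by ear, which is one plausible reading of how the same mechanism supports both "wild sonic exploration and minute intimate expression".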
The vector system can be re-scaled and shifted on the fly, by ear, allowing for control of complex explorations and trajectories in a high-dimensional space.

In this paper, a particular development within this project is described: Foldings, an implementation of a hybrid piano instrument, integrating the acoustic sound of the grand piano with processed sound from the same source. The processing is controlled by the pianist's playing on the keyboard, through the aforementioned vectorization algorithm. The electronic sounds are projected from speakers right behind the piano, causing the two sound domains to merge into a new hybrid instrument that behaves organically. It also responds well to non-piano sounds, such as knocks on the wood, shouts into the piano, preparations or other inside-the-piano playing techniques.

1.1 Background and previous art
It is natural for musicians and composers to push the limits of their instruments, trying to extend their sound and performance possibilities. This is often done in collaboration with the instrument maker (e.g., throughout the development of the modern piano, with the involvement of J.S. and J.C. Bach, Beethoven, Liszt, Alkan and many other pianists), and new techniques that are initially perceived as an abuse of the instrument may later be encouraged and enhanced, or even explicitly supported, in a newer version of the instrument. But as the design of the piano has not changed much during the last century, composers have turned to extended techniques (playing inside the piano, directly on the strings or on the wooden parts) and to preparations with physical objects, so that the strings, although played in normal ways, sound like completely new instruments — inharmonic, percussive, noisy or bell-like — as pioneered by John Cage. Later, composers added pre-recorded sounds to the piano sound, or modified the acoustic sound through electronic means, as in, e.g.,
Stockhausen's Kontakte (with a prerecorded tape part complementing the acoustic piano and percussion) and Mantra (using sine-wave ring modulation on live piano sounds). The current repertoire for piano and live electronics is