GeKiPe, a gesture-based interface for audiovisual performance

José-Miguel Fernández (IRCAM, Paris, France) jose.miguel.fernandez@ircam.fr
Thomas Köppel (Haute Ecole d'Art et de Design de Genève, Geneva, Switzerland) thomas@werkstadt.ch
Nina Verstraete (Flashback Ensemble, Perpignan, France) nina.verstraete.flashback@gmail.com
Grégoire Lorieux (IRCAM, Paris, France) gregoire.lorieux@ircam.fr
Alexander Vert (Flashback Ensemble, Perpignan, France) alexander.vert.flashback@gmail.com
Philippe Spiesser (Haute Ecole de Musique, Geneva, Switzerland) philippe.spiesser@hesge.ch

ABSTRACT
We present GeKiPe, a gestural interface for musical expression that combines images and sounds generated and controlled in real time by a performer. GeKiPe is developed as part of a creation project exploring the control of virtual instruments through the analysis of gestures specific to instrumentalists, and to percussionists in particular. GeKiPe was used for the creation of a collaborative stage performance (Sculpt), in which the musician and their movements are captured by different methods (infrared Kinect cameras and gesture sensors on controller gloves). Using GeKiPe as an alternate sound and image controller allowed us to combine body movement, musical gestures and audiovisual expression to create challenging collaborative performances.

Author Keywords
audiovisual performance, motion capture, gesture recognition, gestural control, gloves, visualization, spatialization

ACM Classification
H.5.1 [Multimedia information systems], H.5.2 [User Interfaces], H.5.5 [Sound and Music Computing]

1. INTRODUCTION
Most musical activities (e.g. performance, conducting, dancing) involve body movements or gestures. Musical gestures can be studied based on their spatial aspects, their functional aspects, their use in performances (as communication or control tools), or for metaphoric artistic purposes.
Recent advances in computing, electronics and sensor technologies have generated growing interest in new musical interface designs, allowing researchers and artists to address questions about movement and gesture in a musical context. Musical gestures can be interpreted as the intersection between observable actions and mental images [4]. They can be studied at various levels, ranging from the purely functional to the purely symbolic, whether we consider them as effective (sound producing), accompanying (supporting the effective gesture) or as more figurative cues [3]. An analogous definition is suggested by [8], who state that a "gesture is a movement or change in state that becomes marked as significant by an agent. [...] For a movement or sound to be(come) gesture, it must be taken intentionally by an interpreter, who may or may not be involved in the actual sound production of a performance, in such a manner as to donate it with the trappings of human significance." In other words, musical gestures should be meaningful and carry significant information (communication, control, metaphoric).

Licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0). Copyright remains with the author(s). NIME'17, May 15-19, 2017, Aalborg University Copenhagen, Denmark.

Figure 1: GeKiPe (Geste, Kinect, Percussion).

The GeKiPe (Geste, Kinect, Percussions) interface was developed in 2015 as a creation and research project whose main interest is the exploration and control of virtual instruments based on the analysis of gestures specific to percussionists. Built on an interdisciplinary approach involving professional players, composers, music programmers and visual artists, GeKiPe aims at concrete musical and audiovisual applications, with special attention to sound, visual and gesture qualities. The GeKiPe project was initiated with the objective of improving musicians' gestural quality and finesse.
Our approach is to achieve this through continuous, fine-tuned controls and sound synthesis, rather than using the traditional "on/off" system in which sounds are