Proceedings of the 17th Sound and Music Computing Conference, Torino, June 24th – 26th 2020

DIGITALLY ENHANCED DRUMS: AN APPROACH TO RHYTHMIC IMPROVISATION

Matteo Amadio, Conservatory of Music “C. Pollini”, Padua, matteoamadio@outlook.com
Alberto Novello, Conservatory of Music “C. Pollini”, Padua, albynovello@gmail.com

ABSTRACT

This paper presents a set of real-time modules that digitally enhance the performance of a drummer. The modules extract rhythmic information from multichannel audio acquired with simple microphones placed on the different parts of the drum set. Based on the predicted tempo, the modules generate complex patterns that can be controlled manually through high-level parameters or left in automatic mode, with the system adapting to the drummer's playing. Such an interactive system is intended mainly for an improvised solo performance confronting a human drummer with a computer; however, it could also be employed effectively in improvisations with larger ensembles or in installations.

1. HISTORICAL BACKGROUND AND RELATED WORK

In the early years of electronic music, when the new technologies began to be considered valid and innovative means of musical expression, percussion instruments played a fundamental role in the transition from the traditional, keyboard-influenced musical language to the all-sound music of the future [1]. Several composers considered percussion a great match for electronics because of their common characteristics. Both instrument families can generate a broad spectrum of unpitched sounds, which encouraged composers to focus on timbre and rhythm. Another shared aspect is modularity, which characterises the setups of both the percussionist and the electronic musician: the possibility of swapping parts of an instrument chain affects the creative process and opens new possibilities in the performative approach [1, 2, 3].
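The tempo prediction described in the abstract can be illustrated with a minimal sketch: given the onset times of detected drum hits, estimate the tempo from the inter-onset intervals. The function name and the median-based estimator are illustrative assumptions, not the authors' implementation.

```python
from statistics import median

def estimate_bpm(onset_times):
    """Estimate tempo in BPM from a list of onset times (in seconds).

    Uses the median inter-onset interval, which is robust to the
    occasional dropped or extra hit.
    """
    if len(onset_times) < 2:
        raise ValueError("need at least two onsets to estimate tempo")
    # Differences between consecutive onsets (inter-onset intervals)
    iois = [b - a for a, b in zip(onset_times, onset_times[1:])]
    return 60.0 / median(iois)

# Hits every 0.5 s correspond to 120 BPM
print(estimate_bpm([0.0, 0.5, 1.0, 1.5, 2.0]))  # 120.0
```

A real system would of course derive the onset times from the microphone signals and track tempo changes over a sliding window, but the core mapping from inter-onset intervals to BPM is the one shown here.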
In the following years, the diffusion of digital technologies, computers and real-time digital audio processing allowed the birth of new strategies for composition and live electronics. In the late '70s Chadabe started experimenting with digital interactive composing systems: algorithms capable of responding in complex ways to the actions of the performer [4]. In his pieces Solo and Rhythms, for example, the performer provides high-level input data to the system, which elaborates them to produce the whole musical output. The performer influences the final result but is unable to control every single event, placing himself in a position where he needs to listen and react to the gestures of the computer.

Copyright: © 2020 Matteo Amadio et al. This is an open-access article distributed under the terms of the Creative Commons Attribution 3.0 Unported License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

In 1983 George Lewis was working at IRCAM, developing interactive software that could automatically generate instrumental music and also analyse the performance of human musicians in order to play along with them [5]: this work established the basis of his Voyager system. The idea of the computer's independence from the human performer led to the abolition of the "human leader/computer follower" hierarchy, creating the possibility of communicating only through the musical language [6]. As a result, some intentions and emotions expressed by the human performer could also be found in the electronic performance, confirming the achievement of an authentic human-machine musical interaction. This last concept was also pointed out by Robert Rowe in the description of one of his early works in the field of human-machine musical interaction, Hall of Mirrors [7]. Rowe describes the feedback loop generated by mutual imitation as two (or more) mirrors facing each other.
Rowe developed his own interactive system, named Cypher. In Cypher the user decides how the "listener" will interact with the "player", allowing high-level control over the actions of the software [8]. In the '90s both Cypher and Voyager were modified to use MIDI data as input and output, in order to process all the necessary data easily [9]. In recent years, great progress has been made in the development of new interactive music systems and digital tools for the analysis of human performances. B-keeper is a real-time beat tracker specifically designed for live performance [10]. The software analyses the audio coming from the kick-drum microphone of a drum set and syncs to the drummer's tempo. It makes it possible to play pieces with pre-recorded audio or MIDI sequences without the constraints imposed by "fixed media", allowing the musicians to shift the performance tempo. Every temporal interval analysed by the system is weighted using probability distributions in order to discard irrelevant data or reduce the impact of human errors on the stability of the system. In this way, the algorithm privileges regularity over sudden changes and, as a consequence, is less reactive in musical situations with drastic tempo changes. Jeff Gregorio follows the opposite approach, considering the drum set as a whole polyphonic resonant system [11]. By applying electromagnets to the drum heads he sends synthesized sounds that activate the vibration of the drums: the software allows direct control over the tones generated by the drums. The system can listen to the performance of a melodic instru-