Kinect-Via-: Max/MSP Performance Interface Series for Kinect's User Tracking via OSC

Jon Bellona
University of Oregon, Digital Arts Department
bellona@uoregon.edu
http://deecerecords.com

ABSTRACT

In this paper, I describe a Max/MSP interface series (Kinect-Via-) for composers who want to route and map user-tracking data from the XBox Kinect. The interface series complements four different OpenNI applications: OSCeleton, Synapse, Processing's simple-openni library, and Delicode's NIMate. All Max/MSP interfaces communicate using OSC (Open Sound Control) messages and are performance-ready, meaning that all routing and system options may be changed in real time. The Kinect-Via- interfaces offer a tangible solution for anyone wishing to explore user tracking with the Kinect for creative applications. The aim of the paper is to discuss the features of four different OpenNI applications, to address potential issues and challenges when working with the OpenNI framework, and to outline formative interface issues surrounding video-tracking technology.

Keywords: XBox Kinect, OpenNI, Max/MSP, Live Interface, Open Sound Control, OSC, OSCeleton, Synapse, Processing, Delicode, NIMate, Kyma.

1. INTRODUCTION

The objectives of the Kinect-Via- interface series are threefold: to serve as a ready-made composition tool, to save users time in building a mapping framework between compositions, and to act as a performance interface. Once the Kinect drivers have been installed and the appropriate interface downloaded,¹ the Kinect-Via- interfaces handle all incoming OSC messages from their respective software automatically.² The Kinect-Via- interface series provides ready-to-use data-mapping objects inside Max/MSP, and each interface provides controls for communicating with its respective OpenNI application. Before jumping into technical specifics, I will first discuss the need for a Kinect interface in context.
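The core job these interfaces perform — turning incoming joint messages into usable controller data — can be illustrated outside Max/MSP as well. The following is a minimal Python sketch, not part of the Kinect-Via- series itself; it assumes an OSCeleton-style "/joint" message carrying (joint name, user ID, x, y, z) with roughly normalized coordinates, a format that may differ between application versions.

```python
# Sketch: routing an OSCeleton-style "/joint" OSC message to a controller value.
# Assumptions (not from the paper): arguments arrive as (joint_name, user_id,
# x, y, z), with x and y roughly normalized to 0.0-1.0. Verify against the
# actual OSC output of the application you are using.

def scale(value, lo, hi, out_lo=0.0, out_hi=1.0):
    """Linearly map value from [lo, hi] to [out_lo, out_hi], clamping the result."""
    if hi == lo:
        return out_lo
    t = (value - lo) / (hi - lo)
    t = max(0.0, min(1.0, t))          # clamp so stray tracking data stays in range
    return out_lo + t * (out_hi - out_lo)

def route_joint(address, *args):
    """Turn one /joint message into a ((user, joint), (x, y, z)) routing entry."""
    joint, user, x, y, z = args
    return (user, joint), (x, y, z)

# Example: map a right hand's height to a 0-127 controller value.
key, (x, y, z) = route_joint("/joint", "r_hand", 1, 0.62, 0.35, 1.8)
cc = round(scale(1.0 - y, 0.0, 1.0, 0, 127))   # invert y: a raised hand gives a larger value
```

In a live patch the same two steps happen inside Max/MSP ([udpreceive], route/scale objects); the point is only that each joint message reduces to an address, an identity key, and three coordinates to be scaled onto a control range.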
Next, I will analyze features of the OpenNI framework by highlighting four OpenNI applications. Third, I will discuss potential issues of video tracking through a real-world implementation of a Kinect-Via- interface. As a guide to the reader, definitions of all terms used throughout the paper may be found in section 8.

User tracking is a reality that offers many rewards, but not without its challenges. My hope is that the Kinect-Via- interface series hastens your discovery of new works and furthers your exploration of 3D space. At the time of this writing, the Kinect-Via- interface series has been downloaded over 1,000 times. [5]

2. THE INTERFACES

Anyone wishing to skip this article and jump right in with the open source interfaces may do so. The various interfaces (Kinect-Via-OSCeleton, Kinect-Via-Synapse, Kinect-Via-Processing, and Kinect-Via-NIMate) may all be downloaded at: http://deecerecords.com/kinect

Figure 1. Kinect-Via-Synapse interface.

3. WHAT'S THE BIG DEAL?

Tracking users in space is not a new concept. David Rokeby's Very Nervous System tracked users with the computer as far back as 1982, [4] and many other video- and light-tracking systems have been introduced since.³ What is stimulating about the Kinect is the affordability of tracking multiple users in 3D space, and especially its provision of joint coordinates that may serve as real-time controllers. Ever since the initial Kinect 'hack', [1] there have been hundreds of projects featuring Kinect user tracking,⁴ and one-step installs have enabled quick

¹ All interfaces may be downloaded at: http://deecerecords.com/kinect
² I bundled a comprehensive sketch for use with Processing, while all other software transmits Kinect tracking skeletons automatically.
³ Systems such as Imago, smart Junior, and EthoVision demonstrate available tracking systems, and software such as Open CCV and Isadora takes advantage of USB web cameras.
⁴ One of my favorite Kinect projects is Robert Hodgin's Body Dysmorphia. (http://roberthodgin.com/body-dysmorphia/)