INSIDE_OUT: a case study for enhancing public participation in mixed reality events

Marientina GOTSIS
Electronic Visualization Laboratory
University of Illinois at Chicago
gotsis@evl.uic.edu

Kang SUN
Electronic Visualization Laboratory
University of Illinois at Chicago
kbs@evl.uic.edu

Geoffrey BAUM
Electronic Visualization Laboratory
University of Illinois at Chicago
xiccarph@evl.uic.edu

Robynne H. GRAVENHORST
School of Kinesiology
University of Illinois at Chicago
anatomical@juno.com

Abstract. INSIDE_OUT took place on June 27th and 28th, 2003, at the University of Illinois at Chicago Electronic Visualization Laboratory and on the AccessGrid. This case study examines a range of challenges and solutions in the production and interaction design of the INSIDE_OUT mixed reality performance event, and the effectiveness of those solutions in transitioning passive audience members to active participation. The event featured live dancer improvisations by the Anatomical Theatre dance company based on the 'safety zone' choreographic exercise. EnergyComposer, a YG-based application, provided the framework for a virtual performance space controlled through camera-based tracking of three different fluorescent color markers. The idiosyncrasies of the tracking system were successfully integrated into the aesthetic planning of the performance event. Two networked passive stereo displays provided the audience with two different viewpoints of the virtual environment. To accommodate a range of expert and inexperienced users in a multi-modal environment, traditional wand navigation was disabled; instead, full-body physical navigation was used to trigger events in a small, controlled environment, supporting lengthy spatial exploration while requiring only a brief comprehension time.

1. Introduction

The INSIDE_OUT event was based upon the EnergyComposer application and served as the primary exhibition of the MFA thesis of Marientina Gotsis [1] at the Electronic Visualization Laboratory (EVL) of the University of Illinois at Chicago. The event was designed as a significant departure from previous EVL-hosted MFA shows in that it gave visitors the opportunity both to watch and to actively participate in a mixed reality event. In addition, this event was the first to feature an optical tracking system developed at EVL. The system provided tracking for two networked passive stereo displays (C-Wall [2] and GeoWall [3]) using Ygdrasil (YG). Ygdrasil is built in C++ around SGI's OpenGL Performer™ visual simulation toolkit [4] and the CAVERNsoft G2 networking library [5]. YG extends Performer's hierarchical scene graph representation of the virtual world database by making the scene graph shared (Figure 4).

Figure 1. Lyndsae Rinio (left), LeAnn Vancil (middle) and Nadine Lollino (right)