Body-Prop Interaction: Evaluation of Augmented Open Discs and Egocentric Body-Based Interaction

Rajiv Khadka* (University of Wyoming)
Amy Banic† (University of Wyoming, Idaho National Laboratory)

Figure 1: A user wearing 3D printed open discs (Body-Props) on his left hand to directly manipulate the dataset (left), and the user's view of the augmented information on top of each open disc as seen through the augmented reality headset (right).

ABSTRACT

This paper presents a novel interaction technique that combines multiple input and output modalities with 3D printed open discs that are tracked and augmented with virtual information. Props have been used in virtual and augmented environments to provide tangible associations to virtual elements. In a larger immersive environment, or when walking around, users may not be able to set these objects/props down and are limited by the number of hands, restricting them to at most two props or controllers. With our technique, users can instead use their body to store and organize the virtual objects associated with tangible props by physically hanging the props on, or attaching them to, the body. This type of interaction is well suited to both immersive visualizations and immersive virtual environments. In this paper, we present the results of an experimental evaluation of our Body-Prop Interaction technique for each type of immersive environment and corresponding task type, demonstrating more effective usage in each scenario than joystick or gesture-based interaction.
Index Terms: Human-centered computing—Mixed/augmented reality; Human-centered computing—User studies; Human-centered computing—Usability testing

*e-mail: rkhadka@uwyo.edu
†e-mail: abanic@cs.uwyo.edu

1 INTRODUCTION

Larger immersive environments, such as a CAVE, a wide-area tracked head-mounted display, or an inside-out augmented reality tracking system, enable the capabilities and benefits of real walking and direct interaction with a virtual or visualization environment. However, this type of system presents a few challenges as well. Users are limited to what they can carry in their hands (two joysticks, for example) to interact with the environment. Previous research has shown that physical props benefit interaction modalities [9]. Furthermore, the cognitive psychology literature shows that users can associate objects with information to better remember where those objects are located, thereby improving interaction for tasks that involve many components and/or states the user would otherwise have to remember [7]. Our novel technique aims to provide the user with additional capabilities for prop-based interaction while the user is mobile in a physically wide-open immersive system.

There are additional challenges in immersive visualizations that our technique aims to solve as well. Immersive data visualization, i.e., 3D datasets presented in virtual environments, has the potential to enhance a user's ability to understand and discover features and novel information in large and complex spatial data [8, 11]. Direct manipulation and interaction can improve the exploration and analysis of three-dimensional (3D) datasets and therefore improve scientific workflow and discovery [11, 17]. One challenge is to design direct manipulation interaction that encourages flexible exploration yet helps to organize and maintain spatial and visual relationships, which are important for data exploration.
In previous work, prop-based interaction and tangible user interfaces have separately been used for interacting with visualizations [2, 6, 10, 15, 19, 21]. However, our technique uniquely combines virtually augmented props with interaction and engagement of a user's own body, in an egocentric reference frame, with the immersive visualization. We present Body-Prop Interaction, a novel tangible multimodal interface that uses augmented open discs and mid-air hand gestures for interaction with immersive visualizations. Our direct interaction technique combines rigid-body augmented reality tracking markers, 3D printed open discs, and an augmented reality headset to enable direct exploration of an immersive visualization, maintain spatial and visual relationships, and provide an engaging user experience. Users can directly manipulate the volume visualization of 3D datasets and see real-time updates in an augmented reality (AR) environment. A user can assign or map portions of the volume visualization to each disc to manipulate the view of the visualization.

In this paper, we also present a performance and usability evaluation comparing our novel Body-Prop interaction technique with two existing interaction techniques: gestures and a joystick controller. Participants completed each of two task types (a data exploration task and a mid-air assembly task) separately with each interaction technique: Body-Prop, gesture-based, and joystick. We found that our Body-Prop interaction technique improves the efficiency of the scientific workflow when exploring or searching for unique data features within immersive visualizations of 3D datasets, compared to gestures and the joystick controller. We also found that the Body-Prop technique yields better performance and usability than gestures during the assembly task.
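To make the assignment step concrete, the mapping of portions of a volume visualization onto individual tracked discs can be sketched as a small registry keyed by each disc's rigid-body marker ID. This is a minimal illustrative sketch, not the authors' implementation; all names here (BodyPropSession, DiscProp, VolumeRegion, and their methods) are hypothetical.

```python
# Hypothetical sketch: mapping sub-volumes of a 3D dataset onto tracked discs.
# Names and structure are illustrative assumptions, not the authors' system.
from dataclasses import dataclass
from typing import Dict, Optional, Tuple


@dataclass(frozen=True)
class VolumeRegion:
    """An axis-aligned sub-volume of the dataset, given as voxel index bounds."""
    x0: int
    x1: int
    y0: int
    y1: int
    z0: int
    z1: int


@dataclass
class DiscProp:
    """A 3D printed open disc identified by its rigid-body tracking marker."""
    marker_id: int
    region: Optional[VolumeRegion] = None       # sub-volume mapped to this disc
    pose: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # last tracked position


class BodyPropSession:
    """Registry that associates tracked disc markers with dataset regions."""

    def __init__(self) -> None:
        self._discs: Dict[int, DiscProp] = {}

    def register_disc(self, marker_id: int) -> DiscProp:
        """Add a newly detected disc to the session."""
        disc = DiscProp(marker_id)
        self._discs[marker_id] = disc
        return disc

    def assign_region(self, marker_id: int, region: VolumeRegion) -> None:
        """Map a portion of the volume visualization onto a disc."""
        self._discs[marker_id].region = region

    def update_pose(self, marker_id: int, pose: Tuple[float, float, float]) -> None:
        """Called once per tracking frame; a renderer would then redraw the
        mapped sub-volume at the disc's new pose for real-time AR feedback."""
        self._discs[marker_id].pose = pose

    def region_for(self, marker_id: int) -> Optional[VolumeRegion]:
        """Look up which sub-volume (if any) a disc currently displays."""
        return self._discs[marker_id].region
```

Because discs can be hung on the body and retrieved later, a registry keyed by marker ID (rather than by controller slot) lets the number of simultaneously mapped regions exceed the number of hands.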
These results show that Body-Prop Interaction provides task performance benefits for interacting with both immersive data visualizations and virtual environments. The technique thus has the potential to further accelerate scientific discovery workflows and to enable more efficient task completion in virtual environments.