Using Whole-Body Orientation for Virtual Reality Interaction

Vitor A.M. Jorge, Juan M.T. Ibiapina, Luis F.M.S. Silva, Anderson Maciel, Luciana P. Nedel
Instituto de Informática, Universidade Federal do Rio Grande do Sul (UFRGS)
{vamjorge, jmtibiapina, luisfmssilva, amaciel, nedel}@inf.ufrgs.br

Abstract

When interacting in the real world, we commonly remember the locations of objects based on our body's previous locations and postures. In this paper we discuss the benefits of whole-body awareness in 3D interactive applications. We propose a technique for navigation and selection in 3D environments that uses an optical tracking system to provide whole-body position and orientation. We explore the peephole metaphor with a tablet PC to artificially enlarge the display and interaction area. In the application implemented to demonstrate the proposed concepts, the participant holds the tablet, moving it around and pointing it in any direction for visualization and interaction.

1. Introduction

When exploring a 3D virtual environment with a mouse and keyboard, users easily become disoriented. Jacob et al. [4] remind us that we are not guided purely by visual cues when moving in the real world. We are also guided by some understanding of the surrounding environment, of our body and the presence of other people, as well as by some sense of physics. Navigating in the real world is relatively easy because our complex biological system provides important information to aid self-orientation. Human vision provides stereo perspective views of the world, giving a notion of position and distance relative to visible objects. The labyrinth provides information about up, down, and balance. Our sense of touch makes us aware of obstacles in direct contact with us, even when we cannot see them. Finally, we possess a sense of position and orientation, called proprioception [2], which tells us at all times where our limbs and other body parts are. However, when the focus shifts from the real world to a virtual world, and one starts to interact with a virtual environment using conventional interfaces, all of these bodily cues vanish. This often reduces body sensation and causes disorientation.

Despite the 3D view, there is normally no information to guide us other than what we see on the display. In the best cases a stereo view is available, but generally all we have are two-dimensional mini-maps of the environment and the keys we press to change what we see. To illustrate this, think of playing a first-person shooter game on a regular personal computer. There is a strong dissociation between vision and movement, i.e., the virtual eye/camera position and orientation in space is controlled by key presses, while in fact we remain seated in front of the screen. In the present work we explore human orientation capabilities without relying only on the sense of vision. We propose to do so using the history one has of one's own body postures while moving in the real world. This is done by implementing the peephole metaphor using a tablet PC as a window to the virtual world, artificially enlarging the display and interaction area. The tablet is held by the participant, who moves it around and points it in any direction.
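As a rough illustration of the peephole metaphor, the tracked pose of the tablet can drive the virtual camera directly, so the screen behaves as a movable window into the scene. The following sketch is our own, not the authors' implementation; the function and parameter names are hypothetical. It builds a standard view matrix (the inverse of the camera's world transform) from a position and orientation such as an optical tracker would report:

```python
import numpy as np

def peephole_view_matrix(tablet_pos, tablet_rot):
    """View matrix for a camera that adopts the tracked tablet's pose.

    tablet_pos: (3,) tablet position in world coordinates.
    tablet_rot: (3, 3) tablet-to-world rotation matrix.

    The view matrix is the inverse of the camera's world transform:
    rotation part R^T, translation part -R^T @ t.
    """
    view = np.eye(4)
    view[:3, :3] = tablet_rot.T
    view[:3, 3] = -tablet_rot.T @ tablet_pos
    return view

# Example: a tablet held at (1, 2, 3) with no rotation. Transforming the
# tablet's own position by the view matrix yields the origin, i.e. the
# camera sits exactly where the tablet is.
v = peephole_view_matrix(np.array([1.0, 2.0, 3.0]), np.eye(3))
camera_space = v @ np.array([1.0, 2.0, 3.0, 1.0])
```

Because the camera pose is taken verbatim from the tracker every frame, walking around or reorienting the tablet changes the view exactly as moving a window through the world would, which is what preserves the bodily cues the text argues for.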