Taxonomy and Implementation of Redirection Techniques for Ubiquitous Passive Haptic Feedback

Frank Steinicke∗, Gerd Bruder∗, Luv Kohli†, Jason Jerald†, and Klaus Hinrichs∗

∗Visualization and Computer Graphics (VisCG) Research Group, Department of Computer Science, Westfälische Wilhelms-Universität Münster, Germany
{fsteini,g_brud01,khh}@uni-muenster.de

†Effective Virtual Environments (EVE) Group, Department of Computer Science, University of North Carolina at Chapel Hill, USA
{luv,jjerald}@cs.unc.edu

ABSTRACT

Traveling through immersive virtual environments (IVEs) by means of real walking is an important activity to increase the naturalness of VR-based interaction. However, the size of the virtual world often exceeds the size of the tracked space, so that a straightforward implementation of omni-directional and unlimited walking is not possible. Redirected walking is one concept to solve this problem of walking in IVEs by inconspicuously guiding the user on a physical path that may differ from the path the user visually perceives. When the user approaches a virtual object, she can be redirected to a real proxy object that is registered to the virtual counterpart and provides passive haptic feedback. In such passive haptic environments, any number of virtual objects can be mapped to proxy objects having similar haptic properties, e.g., size, shape, and texture. The user can sense a virtual object by touching its real-world counterpart. Redirecting a user to a registered proxy object makes it necessary to predict the user's intended position in the IVE. Based on this target position we determine a path through the physical space such that the user is guided to the registered proxy object. We present a taxonomy of possible redirection techniques that enable user guidance such that inconsistencies between visual and proprioceptive stimuli are imperceptible.
We describe how a user's target in the virtual world can be predicted reliably and how a corresponding real-world path to the registered proxy object can be derived.

Keywords: Virtual Reality, Locomotion Interface, Generic Redirected Walking, Dynamic Passive Haptics

1 INTRODUCTION

Walking is the most basic and intuitive way of moving within the real world. Keeping such an active and dynamic ability to navigate through large-scale immersive virtual environments (IVEs) is of great interest for many 3D applications that demand locomotion, such as urban planning, tourism, and 3D entertainment. A head-mounted display (HMD) and a tracking system represent the typical instrumentation of an IVE. Although many domains are inherently three-dimensional and advanced visual simulations often provide a good sense of locomotion, most applications do not support VR-based user interfaces, let alone real walking [33].

However, real walking in IVEs can be realized. An obvious approach is to transfer the user's head movements to changes of the virtual camera in the IVE by means of a one-to-one mapping. This technique has the drawback that the user's movements are restricted by the limited range of the tracking sensors and the rather small workspace in the real world. Therefore, concepts for virtual locomotion interfaces are needed that enable walking over large distances in the virtual world while the user remains within a relatively small space in the real world.

Many hardware-based approaches have been presented to address this issue [1, 15, 16, 26]. Since most of them are very costly and support walking for only a single user, they may not get beyond the prototype stage. However, cognition and perception research suggests that more cost-efficient alternatives exist. Psychologists have known for decades that vision usually dominates proprioceptive, i.e., vestibular and kinesthetic, sensation when the two disagree [7].
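The one-to-one mapping mentioned above, and the gain-based generalization that redirection techniques build on, can be sketched as follows. This is an illustrative sketch only; the function and parameter names are assumptions, not part of the paper:

```python
def update_camera(cam_pos, cam_yaw, head_delta, yaw_delta,
                  translation_gain=1.0, rotation_gain=1.0):
    """Apply one frame of tracked head motion to the virtual camera.

    With both gains at 1.0 this is the one-to-one mapping: every real
    head movement is transferred unchanged to the virtual camera.
    Gains other than 1.0 scale the user's real motion, which is the
    basic mechanism behind redirection techniques.
    (Illustrative sketch; names and parameters are assumptions.)
    """
    x, y, z = cam_pos
    dx, dy, dz = head_delta
    new_pos = (x + translation_gain * dx,
               y + translation_gain * dy,
               z + translation_gain * dz)
    return new_pos, cam_yaw + rotation_gain * yaw_delta

# One-to-one mapping: a 0.5 m real step moves the camera exactly 0.5 m.
pos, yaw = update_camera((0.0, 1.7, 0.0), 0.0, (0.5, 0.0, 0.0), 0.0)
```

A translation gain above 1.0 would let the user cover more virtual distance per real step, one simple way to fit a large virtual world into a small tracked space.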
While computer graphics may provide correct visual stimuli of motion in the IVE, it can only approximate proprioceptive stimuli. Experiments demonstrate that users tolerate a certain amount of inconsistency between visual and proprioceptive sensation [28, 32, 17, 22, 18, 4, 24]. Moreover, users tend to compensate unwittingly for small inconsistencies, making it possible to guide them along paths in the real world that differ from the paths perceived in the virtual world. This so-called redirected walking enables users to explore a virtual world that is considerably larger than the tracked lab space [24] (see Figure 1 (a)).

Besides natural navigation, multi-sensory perception of an IVE increases the degree of presence [10]. Whereas graphics and sound rendering have matured so much that realistic synthesis of real-world scenarios is possible, the generation of haptic stimuli still represents a vast area for research. Tremendous effort has been undertaken to support active haptic feedback by specialized hardware that generates certain haptic stimuli [5]. Such technologies, for example force-feedback devices, can provide compelling haptic feedback, but they are expensive and limit the size of the user's working space due to devices and wires. A simpler solution is to use passive haptic feedback: physical props registered to virtual objects provide real haptic feedback to the user. By touching such a prop the user gets the impression of interacting with the associated virtual object seen in the HMD [19] (see Figure 1 (b)). Passive haptic feedback is very compelling, but a different physical object is needed for each virtual object requiring haptic feedback [9]. Since the interaction space is constrained, only a few physical props can be supported, and thus the number of virtual objects that can be touched by the user is limited.
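The redirection idea described above can be illustrated with a minimal simulation of the user's real-world path: if a small, imperceptible rotation is injected at every step, a user who perceives a straight virtual path actually walks a circular arc in the lab. This is a sketch under assumed names and parameters, not the paper's implementation:

```python
import math

def redirected_step(real_pose, step_length, curvature):
    """Advance the user's real pose by one step along a curved path.

    The injected per-step turn (curvature * step_length) is kept below
    the user's detection threshold, so the user perceives a straight
    virtual path while actually walking a circle of radius
    1 / curvature in the real world.
    (Illustrative sketch; names and parameters are assumptions.)
    """
    x, y, heading = real_pose
    heading += curvature * step_length   # small, imperceptible turn
    x += step_length * math.cos(heading)
    y += step_length * math.sin(heading)
    return (x, y, heading)

# Walking "straight ahead" in many small steps traces a 5 m-radius
# circle in the lab when curvature = 1/5 per meter.
pose = (0.0, 0.0, 0.0)
for _ in range(1000):
    pose = redirected_step(pose, 0.01, curvature=1 / 5.0)
```

The smaller the tolerable curvature, the larger the real circle must be, which is why the tracked space still bounds how aggressively a user can be steered toward a proxy object.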
Moreover, the presence of physical props in the interaction space prevents exploration of other parts of the virtual world that are not represented by the current physical setup. Thus exploration of large-scale environments and support of passive haptic feedback seem to