Analyzing Effects of Geometric Rendering Parameters on Size and Distance Estimation in On-Axis Stereographics

Gerd Bruder*, Andreas Pusch, Frank Steinicke
Departments of Computer Science and Human-Computer-Media
University of Würzburg

Abstract

Accurate perception of size and distance in a three-dimensional virtual environment is important for many applications. However, several experiments have revealed that spatial perception of virtual environments often deviates from the real world, even when the virtual scene is modeled as an accurate replica of a familiar physical environment. While previous research has elucidated various factors that can facilitate perceptual shifts, the effects of geometric rendering parameters on spatial cues are still not well understood. In this paper, we model and evaluate effects of spatial transformations caused by variations of the geometric field of view and the interpupillary distance in on-axis stereographic display environments. We evaluated different predictions in a psychophysical experiment in which subjects were asked to judge distance and size properties of virtual objects placed in a realistic virtual scene. Our results suggest that variations in the geometric field of view have a strong influence on distance judgments, whereas variations in the geometric interpupillary distance mainly affect size judgments.

CR Categories: H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems—Artificial, Augmented, and Virtual Realities; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism—Virtual Reality

Keywords: on-axis stereographics, distance perception, size perception, cue conflicts, head-mounted display

1 Introduction

Modern virtual reality (VR) display technologies enable users to experience a three-dimensional virtual environment (VE) from an egocentric perspective.
Such immersive viewing experiences have an enormous potential for a variety of application domains in which accurate spatial perception during design, exploration, or review of virtual models and scenes is required. Head-mounted displays (HMDs) and immersive projection technologies are often used to provide a user with near-natural feedback of virtual content, as if the user were present in the virtual scene. In particular, modern real-time rendering systems can create compelling immersive experiences offering most of the spatial visual cues we can find in real-world views, including perspective, interposition, lighting, and shadows [Thompson et al. 2011]. However, distance and size perception are often biased in such environments, causing users to over- or underestimate spatial relations in virtual scenes to a much higher magnitude than can be observed in similar scenes in the real world [Loomis and Knapp 2003; Thompson et al. 2004; Interrante et al. 2007].

*e-mail: gerd.bruder@uni-wuerzburg.de
e-mail: andreas.pusch@uni-wuerzburg.de
e-mail: frank.steinicke@uni-wuerzburg.de

Issues with the visual rendering are an obvious candidate source of biased spatial perception. In order for a virtual scene to be displayed stereoscopically on a binocular HMD, the computer graphics system must determine which part of the scene should be displayed where on the two screens. In 3D computer graphics, planar geometric projections are typically applied, which map graphical entities within a 3D 'view' region, i.e., the view frustum, to a 2D image plane. During the rendering process, objects inside the view frustum are projected onto the 2D image plane; objects outside the view frustum are omitted. In on-axis stereographic displays (as used in many HMDs), each view frustum has the shape of a symmetric truncated rectangular pyramid.
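Such a symmetric frustum corresponds to the standard on-axis perspective projection used in real-time graphics. As a minimal sketch (Python/NumPy, OpenGL clip-space conventions; the function name and parameter choices are ours for illustration, not taken from the paper):

```python
import numpy as np

def symmetric_frustum(gfov_deg, aspect, near, far):
    """On-axis (symmetric) perspective projection matrix in OpenGL clip-space
    conventions. gfov_deg is the vertical geometric field of view (GFOV)."""
    top = near * np.tan(np.radians(gfov_deg) / 2.0)  # half-height of near plane
    right = top * aspect                              # half-width of near plane
    return np.array([
        [near / right, 0.0,        0.0,                          0.0],
        [0.0,          near / top, 0.0,                          0.0],
        [0.0,          0.0,       -(far + near) / (far - near), -2.0 * far * near / (far - near)],
        [0.0,          0.0,       -1.0,                          0.0],
    ])

# Sanity check: a point on the near plane maps to NDC depth -1,
# a point on the far plane to NDC depth +1.
M = symmetric_frustum(60.0, 16.0 / 9.0, 0.1, 100.0)
near_pt = M @ np.array([0.0, 0.0, -0.1, 1.0])
far_pt = M @ np.array([0.0, 0.0, -100.0, 1.0])
print(near_pt[2] / near_pt[3], far_pt[2] / far_pt[3])  # approx. -1.0 and 1.0
```

Because the frustum is symmetric, the projection center lies on the optical axis of each eye's camera; off-axis displays would instead require an asymmetric frustum.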
The opening angle at the top of the pyramid, often denoted as geometric field of view (GFOV) [McGreevy et al. 1985], should match the display's field of view (DFOV) for the imagery to be projected in a geometrically correct way [Steinicke et al. 2011a]. Another important characteristic of the human visual system is the interpupillary distance (IPD), which describes the horizontal separation of the eyes and ranges from 5.77 cm to 6.96 cm (median: 6.32 cm) in adult males (according to Woodson [Woodson 1981]). Since the viewpoints of the two eyes are horizontally separated, each eye receives a slightly different retinal image. The brain interprets the binocular inputs and fuses the images, resulting in the impression of a solid space and the perception of depth [Cutting 1997]. Typically, the user's IPD is measured and then applied as the geometric interpupillary distance (GIPD) used for stereoscopic rendering, assuming that the HMD's display units are correctly adjusted in front of the user's eyes.

Both geometric rendering parameters, GFOV and GIPD, have to be defined in all on-axis 3D stereoscopic visualization systems. At the same time, they are particularly prone to calibration errors and therefore bear a high risk of accidentally skewing a user's perception in immersive VEs. Common sources of such errors are naïvely applying manufacturer specifications (e.g., the FOV of built-in displays in head-mounted devices [Kuhl et al. 2009; Kellner et al. 2012]) without verifying the physical display characteristics, or using population means to approximate a user's IPD. Although slight errors in such rendering parameters are quite common in VR environments, it is still not entirely clear what effects these discrepancies have on distance and size cues.
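To make the two parameters concrete, the following sketch (ours, under a simple pinhole-display assumption; all names are illustrative) computes the DFOV implied by a screen's physical width and viewing distance, and the parallel-axis eye offsets that on-axis stereo rendering derives from the GIPD:

```python
import numpy as np

def display_fov_deg(screen_width_m, eye_to_screen_m):
    """Physical display field of view (DFOV) of a flat screen of the given
    width, viewed from the given eye distance (pinhole assumption)."""
    return np.degrees(2.0 * np.arctan(screen_width_m / (2.0 * eye_to_screen_m)))

def stereo_eye_positions(head_pos, gipd_m):
    """On-axis stereo: each eye camera is displaced by half the geometric
    interpupillary distance (GIPD) along the interocular (x) axis; the view
    directions remain parallel (no toe-in)."""
    offset = np.array([gipd_m / 2.0, 0.0, 0.0])
    return head_pos - offset, head_pos + offset

# A 10 cm wide display viewed from 5 cm spans 90 degrees:
print(round(display_fov_deg(0.10, 0.05), 1))  # 90.0

# Left and right eye cameras for the adult-male median IPD of 6.32 cm:
left, right = stereo_eye_positions(np.array([0.0, 1.7, 0.0]), 0.0632)
print(np.linalg.norm(right - left))  # separation equals the GIPD
```

A calibration error then simply means that the GFOV passed to the projection differs from the DFOV computed here, or that the GIPD differs from the user's measured IPD.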
Moreover, it has been found that when users have direct control over a rendering parameter, they often try to use it to compensate for perceptual biases that may have been introduced by miscalibration of other parameters [Steinicke et al. 2011b]. It remains a challenging question how rendering parameters are related with respect to particular cues, and whether they could be used to counteract perceptual biases.

The motivation of this work is to compare mathematical models for selected cues that are dominated by the two rendering parameters GFOV and GIPD in terms of their mutual influence on size and distance perception in realistic VEs [Thompson et al. 2004; Interrante et al. 2007; Willemsen et al. 2009]. We describe the effects