Analysing the Effects of a Wide Field of View Augmented Reality Display on Search Performance in Divided Attention Tasks

Naohiro Kishishita 1, Kiyoshi Kiyokawa 2, Jason Orlosky 4, Tomohiro Mashita 5, and Haruo Takemura 6
Osaka University
1-32 Machikaneyama, Toyonaka, Osaka 560-0043, Japan

Ernst Kruijff 3
Institute of Visual Computing, Bonn-Rhein-Sieg University of Applied Sciences
Grantham-Allee 20, 53757 Sankt Augustin, Germany

ABSTRACT

A wide field of view augmented reality display is a special type of head-worn device that enables users to view augmentations in the peripheral visual field. However, the actual effects of a wide field of view display on the perception of augmentations have not been widely studied. To improve our understanding of this type of display when conducting divided attention search tasks, we conducted an in-depth experiment testing various view management methods. Results show that, depending on the display method, search performance either drops or increases gradually up to 100 degrees of field of view. This suggests that a rapid turning point in performance exists at approximately 130 degrees of field of view. Results also indicate that users exhibited lower discovery rates for targets appearing in peripheral vision, and that field of view has little impact on response time and mental workload.

Keywords: Augmented reality, see-through head mounted displays, peripheral visual field, information display methods.

Index Terms: H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems—Artificial, augmented, and virtual realities; H.5.2 [Information Interfaces and Presentation]: User Interfaces—Ergonomics

1 INTRODUCTION

Augmented reality (AR) is a vibrant field of research that has grown steadily over the last decade. Nevertheless, many researchers are still working to solve fundamental issues.
In particular, perceptual issues have recently gained interest, mainly due to the development of new interactive visualization techniques optimized for visual perception and understanding. These techniques have traditionally been geared towards narrow field of view (FOV) displays, which are quickly entering the mass market. Wide FOV displays, which are more prevalent in immersive virtual reality (VR) setups, are likely to follow once technical limitations have been overcome. Still, even for this medium, only a few experimental results have been reported that focus on the underlying mechanisms behind the perception of virtual content.

Enabling a wide FOV in augmented reality displays has a number of benefits. AR see-through head mounted displays (HMDs) typically provide a horizontal FOV of 20-60 degrees [1], which is very narrow compared to the FOV of the human eye. The human visual field spans approximately 180 degrees horizontally and 125 degrees vertically. Humans rely heavily on the peripheral visual field [2], and limiting the FOV greatly increases the difficulty of various visual tasks [3]. A wider FOV can also be expected to improve view management, since a wider screen provides more usable space and can thereby reduce information clutter. In turn, this can considerably improve the visibility, readability, and depth perception of labels [4]. However, to date, statistical and experimental evidence is not available to support these assumptions.

In this publication, we examine perceptual issues from a different angle by analysing how wide FOV displays affect the perception of augmented virtual objects. To do so, we conducted a thorough analysis of task performance when searching for target labels. The study presented in this paper is intended to provide first insights into the effects of a wide FOV display that can inform the design and optimization of user interfaces for this kind of display.
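To make the FOV figures above concrete: the horizontal angle subtended by a flat display centred in front of the eye follows from simple trigonometry. The sketch below (function name and values are illustrative, not from the paper) computes that angle and compares it with the approximately 180-degree human horizontal field cited above.

```python
import math

def horizontal_fov_deg(screen_width_m: float, eye_distance_m: float) -> float:
    """Horizontal angle (degrees) subtended by a flat screen of the given
    width, centred at the given viewing distance from the eye."""
    return math.degrees(2.0 * math.atan((screen_width_m / 2.0) / eye_distance_m))

# Illustrative numbers only: a screen exactly as wide as its viewing
# distance subtends 90 degrees; a half-as-wide screen subtends ~53 degrees,
# within the 20-60 degree range typical of AR see-through HMDs [1].
fov_wide = horizontal_fov_deg(0.10, 0.05)    # -> 90.0
fov_typical = horizontal_fov_deg(0.05, 0.05) # -> ~53.1
coverage = fov_typical / 180.0               # fraction of the human horizontal field
```

The doubling inside `atan` accounts for the screen extending symmetrically to both sides of the line of sight; real HMD optics complicate this, so the formula is only a first-order approximation.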
At the moment, design decisions are often made in an ad-hoc manner, mostly based on what is known from eye physiology and related work in (semi-)immersive environments. We explore to what extent a wide FOV affects search task effectiveness, and look more closely into related attention and mental workload issues.

Our study tackles these issues by monitoring user performance on a divided attention task. While solving a puzzle, users were asked to search for targets both within and outside their active FOV (the FOV containing augmentations), using two different types of view management. This task space is comparable to tasks generally encountered in AR navigation scenarios, where users also have to split their attention and concentration. This work picks up where our previous study on wide FOV displays left off [5], offering a concentration-intensive setting more closely resembling a real-world application, as shown in Figure 1. The experiment presented here focuses on a divided attention task to simulate a more practical outdoor AR scenario in which users are presented with spatial guidance information while performing a real-time task. Furthermore, the task was conducted outdoors with a wide FOV optical see-through HMD, and

1 kishishita.naohiro@lab.ime.cmc.osaka-u.ac.jp
2 kiyo@ime.cmc.osaka-u.ac.jp
3 ernst.kruijff@h-brs.de
4 orlosky@lab.ime.cmc.osaka-u.ac.jp
5 mashita@ime.cmc.osaka-u.ac.jp
6 takemura@ime.cmc.osaka-u.ac.jp

Figure 1: Outdoor experiment setup showing the task interface, wide FOV display, and head tracking apparatus.