EXMAR: EXpanded view of Mobile Augmented Reality

Sungjae Hwang, Hyungeun Jo, and Jung-hee Ryu
Graduate School of Culture Technology, KAIST
best@kaist.ac.kr, acid@kaist.ac.kr, ryu@business.kaist.ac.kr

Figure 1. EXMAR supports exploration of the off-screen view. (a) The user's view when using EXMAR. (b) Magnification of (a). The user can explore off-screen points of interest, together with environmental contextual information, by a simple dragging gesture. Green dots represent augmented tags (read the sequence of images from left to right).

ABSTRACT

Many studies have sought to minimize the increase in psychological and physical load caused by mobile augmented reality systems. In this paper, we propose a new technique called "EXMAR", which enables the user to explore his/her surroundings with an expanded field of view, resulting in a decrease of physical movement. Through this novel interaction technique, the user can explore off-screen points of interest, together with environmental contextual information, by simple dragging gestures. To evaluate this initial approach, we conducted a proof-of-concept usability test under a set of scenarios: "Exploring objects behind the user", "Avoiding the invasion of personal space", and "Walking and typing with a front view." Through this initial examination, we found that users can explore off-screen points of interest and grasp spatial relations without an increase in mental effort. We believe this preliminary study gives a meaningful indication that an interactive field of view can be a useful way to decrease physical load without additional mental effort in mixed and augmented reality environments.

KEYWORDS: Augmented Reality, Mixed Reality, Distortion Correction, Expanded Field Of View (EFOV), Fish-eye lens, interaction

INDEX TERMS: H.5.1
[Information Systems]: Multimedia information systems — Artificial, augmented, and virtual realities

1 INTRODUCTION

Augmented Reality (AR) is a system that combines real and computer-generated information in the real world [5]. Over the past few years, as mobile devices have grown rapidly, browsing information in mobile augmented reality has become a widespread practice. However, browsing for points of interest with handheld devices carries a number of drawbacks.

A major problem is the limited field of view and the lack of stereo support, resulting in fewer depth cues [1]. For instance, the iPhone 3GS has a much narrower field of view (FOV of 38.7 degrees horizontally and 50.1 degrees vertically) compared to the naked eye (FOV of nearly 180 degrees). This disadvantage also leads to an increase in physical movement. To overcome this, many studies have focused on showing spatial cues for off-screen points of interest. However, these methods cannot convey the environmental relations between computer-generated tags and real objects, and the overview itself occludes part of the screen.

Another problem is the "invasion of personal space" that arises when using mobile devices to obtain someone's information in an AR environment. Recently, face-recognition-based prototypes for retrieving a person's augmented ID were introduced. However, this scenario depends on direct, gaze-based scanning of the face at personal distance. We believe it is not socially acceptable to point a mobile device directly at unknown people in real life.

To overcome these two key issues, we propose a new interaction method, EXMAR, which consists of two novel techniques: the first is an expanded and undistorted field of view obtained from a fisheye lens; the second is an interaction method for dynamically changing the field of view shown on the screen.

2 RELATED WORK

It is challenging to display off-screen or occluded points of interest in augmented reality [3].
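The underlying geometry is simple: a point of interest is off-screen whenever its bearing falls outside the camera's horizontal FOV, and the sign of the angular offset tells the user which way to turn. The following minimal Python sketch illustrates this; the function name and angle conventions are our own illustration, not taken from any particular system, and the default FOV matches the iPhone 3GS figure above.

```python
def poi_screen_state(device_heading_deg, poi_bearing_deg, fov_deg=38.7):
    """Classify a point of interest (POI) as on-screen or off-screen
    for a camera with the given horizontal FOV, and report the signed
    angular offset. Angles are compass bearings in degrees."""
    # Signed offset of the POI from the view direction, wrapped to (-180, 180]
    diff = (poi_bearing_deg - device_heading_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= fov_deg / 2.0:
        return ("on-screen", diff)
    # Positive offset means the POI lies to the user's right
    return ("off-screen right" if diff > 0 else "off-screen left", diff)
```

With a 38.7-degree FOV, only POIs within about 19 degrees of the view direction are visible, which illustrates why a handheld AR browser forces so much physical rotation; the wrap-around in `diff` also handles headings near north (e.g. heading 350, bearing 5 is on-screen).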
To address this issue, a number