Peek and Look: Accessing Off-Screen Targets Using Head Tracking

Clayton Epp, Miguel Nacenta, Carl Gutwin, Regan Mandryk
University of Saskatchewan, Department of Computer Science
176 Thorvaldson Building, 110 Science Place
Saskatoon, SK S7N 5C9, CANADA
Tel: 1-306-966-4886
cce381@mail.usask.ca, nacenta@cs.usask.ca, gutwin@cs.usask.ca, regan@cs.usask.ca

ABSTRACT
Although screen displays are getting larger and more cost-effective, data sets are becoming more complex and demand increasing amounts of space on the screen. Toolbars and menus occupy considerable space and compete with the data or document being viewed. To maximize the screen real estate available for data manipulation, we propose storing the tools off-screen. To access these off-screen tools we present two interaction techniques that allow the position of the head to change the viewport state (e.g., by panning or zooming), leaving the hands free for other tasks. In this preliminary work, we explore variations of these interaction techniques and the challenges that arise in their implementation and use.

ACM Classification: H5.m. [User Interfaces]: Interaction styles.
General terms: Design, Human Factors
Keywords: navigation, off-screen targets, head tracking.

INTRODUCTION
In many graphical applications, such as geographic information systems (GIS) and fine-grained illustration, the workspace is drastically larger than the available viewport. Competing demands put a premium on screen space, in particular between the data view and the space required for tools. Tools occupy a lot of room: for example, in the image-editor application shown in Figure 1, the window layout allocates 23% of the screen to tools, leaving only 77% for data. A number of solutions to this problem exist: shortcut keys, transparent toolbars, configurable UI elements, virtual desktops, and focus+context techniques.
However, these techniques have shortcomings: they require expert knowledge (shortcut keys), force context switching (virtual desktops), obstruct interaction (transparent toolbars), or consume valued screen space (configurable UI elements and focus+context techniques).

We suggest that tools be stored off-screen, around the edges of the visible workspace. For seldom-used tools, off-screen storage may be simpler to access than hunting through hierarchical menu systems. Storing tools off-screen also means that applications can make use of a much larger workspace than the monitor provides. For example, extending the workspace in Figure 1 by 10% in all directions enlarges each dimension by 20%, adding almost half (44%) of the original workspace area. To access this content we propose two interaction techniques in which viewport adjustments are achieved via head tracking.

Figure 1: Typical data / tool space relationship.

RELATED WORK
Some techniques use arrows and halos around the periphery of the screen to visualize off-screen targets [1]. Hop [4] combines several techniques to improve selection of off-screen targets. There has also been research using just-off-screen eye tracking to facilitate reading tasks [5]; however, that research focuses mainly on automatic scrolling of the viewport, and uses eye gaze and dwell time to access off-screen buttons, whereas we use head movement. Recent work has used physical proximity to control graphic enlargement as a method of natural interaction to improve readability on monitors [2]; however, this method focuses on on-screen content with user proximity awareness.

Copyright is held by the author/owner(s).
UIST'08, October 19-22, 2008, Monterey, California, USA.
ACM 978-1-59593-975-3/08/10.
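The core mechanism proposed above (head position driving viewport state, e.g. panning toward off-screen tools) can be sketched in a few lines. The dead-zone and gain constants, the tracker interface, and the function name below are illustrative assumptions for this sketch, not details from the paper:

```python
# Minimal sketch of head-controlled viewport panning toward off-screen
# content. Assumes a head tracker that reports horizontal/vertical head
# offset (in metres) from a calibrated neutral position; all constants
# and names here are hypothetical, not from the paper.

DEAD_ZONE = 0.03   # metres of head movement to ignore (posture jitter)
GAIN = 2000.0      # workspace pixels panned per metre of head offset

def pan_offset(head_dx, head_dy):
    """Map a head displacement to a viewport pan (in pixels).

    A dead zone keeps the view stable under small involuntary
    movements; outside it, panning grows linearly from zero so there
    is no jump at the dead-zone boundary.
    """
    def axis(d):
        if abs(d) < DEAD_ZONE:
            return 0.0
        sign = 1.0 if d > 0 else -1.0
        return sign * (abs(d) - DEAD_ZONE) * GAIN
    return axis(head_dx), axis(head_dy)

# Leaning 5 cm to the right pans the viewport right, exposing the
# off-screen tool area; returning to neutral pans back to the data.
```

In this sketch the mapping is position-based rather than rate-based, so releasing the lean (returning the head to neutral) restores the original view, matching the "peek" behaviour the techniques' names suggest.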