OMG! - A New Robust, Wearable and Affordable Open-Source Mobile Gaze Tracker

Kristian Lukander, Sharman Jagadeesan, Huageng Chi, Kiti Müller
Brain Work Research Centre, The Finnish Institute of Occupational Health
Topeliuksenkatu 41 a A, FIN-00250 Helsinki, Finland
Corresponding author: kristian.lukander@ttl.fi

ABSTRACT
We present a novel, robust, affordable, and wearable mobile gaze tracker. The tracker takes a model-based approach to tracking gaze and maps the calculated gaze onto a scene video. The system is built from standard off-the-shelf components and is, to our knowledge, the first to use a 3D-printed frame. The system will be published as open source, and the total cost of the components for building the system is 350€. The model-based tracking provides a solution robust to changing lighting conditions and frame slippage on the head of the user.

Author Keywords
gaze tracking; eye tracking; mobile; wearable

ACM Classification Keywords
H.5.1 [User Interfaces]: Gaze interfaces; H.5.2 [Input Devices and Strategies]; I.4.9 [Image Processing Applications]

General Terms
Human Factors; Measurement

INTRODUCTION
While gaze tracking studies have a long history within controlled laboratory environments, tracking gaze on fixed computer screens delivering stimuli, current trends in HCI research demand a more mobile approach applicable to field studies, tracking gaze with mobile devices and in actual operational environments. For reviews of eye tracking development, see e.g. the work of Hayhoe and Ballard [9]. Hansen [8] presents a survey of models for tracking gaze and eye features, and Duchowski [3] delivers a broad survey of eye tracking applications. Evans et al. [5] focus on outdoor gaze tracking and some of the complexities inherent in taking gaze tracking out of the lab.
Constructing a wearable gaze tracker poses three fundamental problems: a) sensitivity to changing lighting, which makes the detection of tracked eye features challenging; b) the error introduced by slippage of the frame in relation to the head/eye of the user; and c) the typical complexity of user calibration.

Earlier open source gaze tracker systems include the openEyes system [12], which introduced the popular Starburst algorithm for detecting eye features; the ITU Gaze Tracker [18], which aims to provide a low-cost alternative to commercial gaze trackers; Haytham [15], developed more toward direct gaze interaction in real environments; and the wearable system by Ryan et al. [17], which aims to operate under visible light conditions.

We have developed a novel wearable gaze tracking system that takes a model-based approach to localizing the eye and the gaze vector in 3D, making the system considerably more robust against slippage of the headgear while requiring only simple user calibration. The system is built from affordable off-the-shelf components and a custom printed circuit board (PCB) with standard electrical components, and, as a novel development, our system is to our knowledge the first to use a 3D-printed frame for the headgear. The tracking and calibration software, together with the frame and PCB designs, will be released as open source under the GNU General Public License. Apart from a standard laptop computer for running the software, the total cost of the system is about 350€.

Here we explain the design rationale for the system, introduce the operational principles, the geometry, and the hardware, describe the modular software design, and present the first tracking results.

OBJECTIVES AND DESIGN RATIONALE
Typical barriers standing in the way of more widespread use of gaze tracking include the intrusive nature of tracking devices, complex issues with robustness and calibration under varying operating conditions, price and availability, and limited use in natural environments.
Proprietary solutions also somewhat limit access to the measurement principles and technical solutions. Hansen [8] lists a number of features for designing future gaze trackers, such as the development of head-mounted trackers for accurate and flexible measurements in natural surroundings, flexible setups for adaptable tracker designs, limiting calibration to simplify experimental setups and add robustness, and decreasing the cost of the systems. We would suggest extending the list with modular designs and

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org.
MobileHCI '13, August 27-30, 2013, Munich, Germany
Copyright is held by the owner/author(s). Publication rights licensed to ACM.
ACM 978-1-4503-2273-7/13/08…$15.00.
http://dx.doi.org/10.1145/2493190.2493214