FittsFace: Exploring Navigation and Selection Methods for Facial Tracking

Justin Cuaresma and I. Scott MacKenzie
Department of Electrical Engineering and Computer Science, York University, Toronto, ON, Canada
justincuaresma@gmail.com, mack@cse.yorku.ca

Abstract. An experimental application called FittsFace was designed according to ISO 9241-9 to compare and evaluate facial tracking and camera-based input on mobile devices for accessible computing. A user study with 12 participants employed a Google Nexus 7 tablet to test two facial navigation methods (positional, rotational) and three selection methods (dwell, smile, blink). Positional navigation was superior, with a mean throughput of 0.58 bits/second (bps), roughly 1.5× the value observed for rotational navigation. Blink selection was the least accurate selection method, with a 28.7% error rate. The optimal combination was positional+smile, with a mean throughput of 0.60 bps and the lowest tracking drop rate.

Keywords: Camera input · Facial tracking · Blink selection · Smile selection · Fitts' law · ISO 9241-9 · Qualcomm Snapdragon · Accessible input techniques

1 Background

Smartphones and tablets are now an integral part of daily life, with new technologies emerging regularly. Current devices include a media player, a camera, and sophisticated communications electronics. Notably, physical buttons have given way to smooth touchscreen surfaces, and this shift has changed how users interact with their devices. Touchscreen surfaces allow input via touch gestures such as taps and swipes. Additional sensors, such as gyroscopes and accelerometers, enable other forms of interaction (e.g., tilt) that enhance the user experience. One common sensor on smartphones and tablets that is rarely employed for user input is the front-facing camera. Although mostly used as a novelty, the front-facing camera can be adopted to perform facial tracking.
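The throughput values reported in the abstract follow the ISO 9241-9 convention: effective index of difficulty divided by movement time, where IDe = log2(Ae/We + 1) and the effective target width is We = 4.133 × SDx (the standard deviation of selection coordinates). The sketch below illustrates this computation with made-up trial numbers, not the study's actual data:

```python
import math
from statistics import stdev

def throughput(amplitude_px, selection_offsets_px, movement_time_s):
    """ISO 9241-9 throughput (bits/second) for one condition.

    amplitude_px: movement amplitude Ae (target distance)
    selection_offsets_px: per-trial selection offsets along the task axis
    movement_time_s: mean movement time for the trials
    """
    # Effective width: 4.133 * SDx captures ~96% of selections
    we = 4.133 * stdev(selection_offsets_px)
    # Effective index of difficulty, in bits (Shannon formulation)
    ide = math.log2(amplitude_px / we + 1)
    return ide / movement_time_s

# Hypothetical data: 200 px amplitude, eight selection offsets, mean MT 4.1 s
tp = throughput(200, [-12, 5, 18, -7, 3, 9, -15, 1], 4.1)
```

With these illustrative numbers the result is on the order of 0.6 bps, the same scale as the facial-tracking throughputs reported above (touch-based input, by comparison, typically yields several bps).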
The front-facing camera is sometimes used in picture-taking applications to enhance or alter one's facial features. An app called Reframe on the App Store, shown in Fig. 1, illustrates a typical example. The app overlays a user-selected pair of glasses on the user's face, allowing the user to virtually try on frames and see how they look. Beyond novelty apps, there are other uses for facial tracking. In games, facial tracking is presently used to maneuver in-game elements [1]. Accessible computing is another candidate. Android and iOS already integrate accessibility features via voice I/O (e.g., Siri); however, there is still no comprehensive mobile solution

© Springer International Publishing AG 2017
M. Antona and C. Stephanidis (Eds.): UAHCI 2017, Part II, LNCS 10278, pp. 403–416, 2017. DOI: 10.1007/978-3-319-58703-5_30