Usability Design of a Scanning Interface for a Robot
Used by Disabled Users
Anthony S. White¹ and Stephen Prior²
¹ School of Computing Science, Middlesex University, The Burroughs, Hendon, London, NW4 4BT
² Product Design and Engineering, Middlesex University, Bramley Rd, Trent Park, Enfield, London, N14 4YZ
Abstract. We present the results of examining a scanning user interface implementation, with command inputs in the form of head gestures, for a rehabilitation robot, using variations of Fitts' law and comparing these with a servo eye-tracking model. Calculations show that the movement time prediction is more accurate in this case when the servo eye model is used. The response from the linearised eye model predicts that there is a minimum scanning distance that can be used and a minimum spacing between command displays.
Keywords: scanning user interface, servo eye model, Fitts' law, rehabilitation, robotics, gestures.
1 Introduction
This work results from a research programme supported by the charity Aspire [1] to develop a cheap robot arm that attaches to a wheelchair as assistive technology for paraplegic users. Other workers have examined the use of computer interfaces by disabled users [2], [3]. Keates et al. [2] found significant differences in response times for impaired users, as well as noisy responses, while Lesher et al. [3] found that it was possible to allow for anticipation.
We employed two methods for presenting robot language commands to the disabled user. The first presented commands in the form of a flat scanning menu system and was used during initial development and evaluation of the Middlesex Manipulator. The second employed a Microsoft Windows dialog-based graphical user interface. This allowed all control options to be presented simultaneously, allowing faster task completion; however, it required the user to be fairly competent with a mouse or trackball. This requirement led to the development of a 'Head Mouse' and voice control, which proved better suited to use with the flat menu system. A moving bar scanned the menu from left to right, and the user responded when the bar was in line with the chosen command. The selected commands were translated into motor commands using a special robot language, JUVO.
Fig. 1. Simple scanning system
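The scanning behaviour described above can be illustrated with a short sketch. The following Python listing is a minimal, hypothetical rendition of a single-switch scanning loop with a fixed dwell time per command; the names (COMMANDS, DWELL_TIME, scan_menu, highlight) and the timing value are assumptions for illustration only and are not taken from the Middlesex Manipulator software or the JUVO language.

    import time

    COMMANDS = ["MOVE LEFT", "MOVE RIGHT", "GRIP", "RELEASE", "STOP"]  # illustrative command set
    DWELL_TIME = 0.8  # seconds the bar rests on each command (assumed value)

    def highlight(command):
        # Stand-in for redrawing the menu with this command lit up.
        print("[scanning] " + command)

    def scan_menu(user_pressed, commands=COMMANDS, dwell=DWELL_TIME):
        # Scan the menu left to right, cycling until the user's head-gesture
        # switch (user_pressed() -> bool) fires while a command is highlighted.
        while True:
            for command in commands:
                highlight(command)
                time.sleep(dwell)
                if user_pressed():
                    return command  # passed downstream for translation into motor commands

In such a scheme the dwell time fixes the trade-off between selection speed and error rate: a shorter dwell reduces the time to reach a given command but leaves the user less time to react, which is the trade-off the Fitts' law and servo eye-model analyses in this paper address.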