This article has been accepted for inclusion in a future issue of this journal. Content is final as presented, with the exception of pagination.
IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS
Automatic Lock of Cursor Movement: Implications
for an Efficient Eye-Gaze Input Method for
Drag and Menu Selection
Atsuo Murata and Waldemar Karwowski, Senior Member, IEEE
Abstract—This study proposed a method—automatic lock of
cursor movement (ALCM)—that locks a cursor at the center of
a target at the instant the cursor enters the target. The method
is intended to suppress irritating subtle cursor movements that
occur when an eye-gaze input system transforms involuntary eye
movement (e.g., drift) into cursor coordinates. The effectiveness
of the proposed ALCM was verified using pointing performance
(speed and accuracy) in two types of HCI tasks. In a drag task,
we compared mouse input versus eye-gaze input with use of a
backspace (BS) key or voice input; the key press or voice command triggered
target selection once the eye gaze was aligned with a target. In a
menu selection task, we compared mouse input with eye-gaze input
under two voice input conditions; this task required aligning the gaze
with a menu and a menu item and selecting each via voice input.
Whether the ALCM function was added to the eye-gaze
input system or not was a within-subject factor. The input method
and target sizes were within-subject factors. The study concluded
that the ALCM improved pointing accuracy for all eye-gaze input
methods in both tasks.
Index Terms—Automatic lock of cursor movement (ALCM),
click, drag, HCI, involuntary eye movement, menu selection, point-
ing time, prediction accuracy.
I. INTRODUCTION
Since the initial proposal of an eye-gaze input system using
an eye tracking system [1]–[3], enhancements in the accu-
racy and resolution of the eye tracker have enabled researchers
to design HCI tasks using eye-gaze input systems that are faster
and more intuitive than a mouse [4]–[7]. HCI techniques that
incorporate eye movements into a human–computer dialogue
have found that eye-gaze input systems ensure faster pointing
[8]–[15]. Sibert and Jacob [5] and Murata [6] observed faster
target acquisition performance using eye-gaze with short (less
than 150 ms) dwell times than using a mouse. Agustin et al.
[16] suggested that an eye-gaze input would be promising even
in game interactions, provided that the eye tracker is sufficiently
accurate and responsive and that a well-designed interface is
available.
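Dwell-based selection of the kind studied in [5] and [6] can be sketched in a few lines. The sample format, rectangular hit test, and 150 ms threshold below are illustrative assumptions, not details of the cited implementations:

```python
from typing import Iterable, Optional, Tuple

Sample = Tuple[float, float, float]  # (time_s, gaze_x, gaze_y)

def dwell_select(samples: Iterable[Sample],
                 target: Tuple[float, float, float, float],
                 dwell_s: float = 0.15) -> Optional[float]:
    """Return the time at which the gaze has stayed inside `target`
    (left, top, width, height) for `dwell_s` seconds, or None."""
    left, top, w, h = target
    enter_t = None
    for t, x, y in samples:
        inside = left <= x <= left + w and top <= y <= top + h
        if inside:
            if enter_t is None:
                enter_t = t          # gaze just entered the target
            elif t - enter_t >= dwell_s:
                return t             # dwell threshold reached: select
        else:
            enter_t = None           # gaze left the target: reset timer
    return None
```

With 60 Hz samples held inside the target, selection fires once 150 ms of continuous dwell has accumulated; any excursion outside the target resets the timer, which is why involuntary drift is so disruptive to this scheme.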
Manuscript received July 26, 2018; revised November 13, 2018; accepted
November 18, 2018. This paper was recommended by Associate Editor S.
Landry. (Corresponding author: Atsuo Murata.)
A. Murata is with the Department of Intelligent Mechanical Systems, Gradu-
ate School of Natural Science and Technology, Okayama University, Okayama
700-8530, Japan (e-mail: murata@iims.sys.okayama-u.ac.jp).
W. Karwowski is with the Engineering and Management Systems, University
of Central Florida, Orlando, FL 32816 USA (e-mail: wkar@ucf.edu).
Digital Object Identifier 10.1109/THMS.2018.2884737
Previous studies have also considered an optimal and effec-
tive click method [14], [17], a menu selection method [18], and
a character input method [19]. According to Bader and Bey-
erer [7], these techniques diverge from natural gaze behavior to
trigger events such as click, drag or menu selection, which are
frequently used in a variety of HCI tasks. Although movements
corresponding to a mouse’s cursor movements can be executed
naturally by moving the line of gaze from point to point,
emulating a mouse's left click, for example,
forces participants to carry out unnatural eye movements, such
as holding a fixation for a fixed duration.
Bader and Beyerer [7] emphasized the importance of achieving
more natural eye movement in order to enhance an eye-gaze
input system’s usability.
A number of drawbacks of eye-gaze input systems need to be ad-
dressed before they can be put to practical use in actual HCI tasks.
In particular, drift or jittering of the cursor caused by involuntary
eye movements during fixation, which arises when transform-
ing the location of the eye gaze into cursor coordinates, can be irritating
to the users of an eye-gaze input system who are attempting to
click, drag or perform menu selection tasks. The studies cited
above [8]–[15], [17]–[19] did not take this issue into account.
This problem may undermine the potential for the widespread
use of eye-gaze input systems in HCI tasks.
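To make the idea concrete, the ALCM behavior described in the abstract might be sketched as follows. The release margin (a hysteresis band around the target) and the class names are our illustrative assumptions, not details taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Target:
    left: float
    top: float
    width: float
    height: float

    @property
    def center(self):
        return (self.left + self.width / 2, self.top + self.height / 2)

    def contains(self, x, y, margin=0.0):
        return (self.left - margin <= x <= self.left + self.width + margin
                and self.top - margin <= y <= self.top + self.height + margin)

class ALCMCursor:
    """Lock the cursor to a target's center while the gaze stays near it."""
    def __init__(self, targets, release_margin=20.0):
        self.targets = targets
        self.release_margin = release_margin  # assumed hysteresis band (px)
        self.locked = None

    def update(self, gx, gy):
        if self.locked is not None:
            # Ride out small involuntary movements (drift, jitter).
            if self.locked.contains(gx, gy, self.release_margin):
                return self.locked.center
            self.locked = None  # gaze moved well away: release the lock
        for t in self.targets:
            if t.contains(gx, gy):
                self.locked = t          # gaze entered a target: lock on
                return t.center
        return (gx, gy)                  # no lock: raw gaze position
```

While locked, small fluctuations of the measured gaze leave the displayed cursor fixed at the target center, so a click, drag, or menu selection can be triggered without the cursor slipping off the target.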
It is impractical to rely only on eye-gaze inputs when execut-
ing relatively complicated tasks such as menu selection, even
though, as Bader and Beyerer [7] suggested, using only eye-
gaze is more natural. One solution to this problem may be to
combine eye-gaze input with voice input or key pressing. How-
ever, this technique also has its disadvantages. With an eye-gaze
input system, involuntary eye movements occur while the user
fixates on a target and simultaneously utters a voice command or
presses a key. These movements produce subtle fluctuations of the
cursor, making it difficult to keep gazing at the target, so the cursor
corresponding to the line of gaze (fixation point)
may unintentionally drift out of the intended target. Such de-
viation degrades the accuracy and speed of pointing. This would
be attributable to the interference between the perceptual and
visual system and the muscular or auditory system for keys or
utterances [20], [21], which in turn would produce inaccurate
and slow pointing.
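One common mitigation, distinct from the ALCM proposed here, is to low-pass filter the gaze samples before mapping them to cursor coordinates. A minimal moving-average sketch (the window length is an illustrative choice, not from the paper):

```python
from collections import deque

class GazeSmoother:
    """Moving-average filter that damps fixation jitter in gaze samples."""
    def __init__(self, window=5):
        self.xs = deque(maxlen=window)
        self.ys = deque(maxlen=window)

    def update(self, x, y):
        # Append the new sample; deque drops the oldest beyond `window`.
        self.xs.append(x)
        self.ys.append(y)
        return (sum(self.xs) / len(self.xs), sum(self.ys) / len(self.ys))
```

Smoothing attenuates jitter but only delays, rather than prevents, the cursor leaving the target during a sustained drift, which motivates a lock-based approach such as the ALCM.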
Partala et al. [20] compared the com-
bination of gaze pointing and facial-muscle EMG clicking to
2168-2291 © 2018 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.