I See What You’re Saying: A Literature Review of Eye Tracking
Research in Communication of Deaf or Hard of Hearing Users
Chanchal Agrawal
School of Information
Rochester Institute of Technology
chaanchalagrawal@gmail.com
Roshan Peiris
School of Information
Rochester Institute of Technology
roshan.peiris@rit.edu
ABSTRACT
Deaf or hard-of-hearing (DHH) individuals rely heavily on their
visual senses to be aware of their environment, giving them
heightened visual cognition and improved attention management
strategies. Eyes have thus been shown to play a significant role in
these visual communication practices, and many studies have
therefore adopted methodologies, specifically eye-tracking,
to understand the gaze patterns and analyze the behavior of DHH
individuals. In this paper, we provide a literature review of 55
papers and data analysis from eye-tracking studies concerning hearing
impairment, attention management strategies, and modes of
communication such as visual- and text-based communication.
Through this survey, we summarize the findings and provide future
research directions.
CCS CONCEPTS
• Human-centered computing → Accessibility theory, con-
cepts and paradigms.
KEYWORDS
Deaf or Hard of Hearing, eye tracking, eye gaze, communication,
attention
ACM Reference Format:
Chanchal Agrawal and Roshan Peiris. 2021. I See What You’re Saying: A
Literature Review of Eye Tracking Research in Communication of Deaf or
Hard of Hearing Users. In The 23rd International ACM SIGACCESS Conference
on Computers and Accessibility (ASSETS ’21), October 18–22, 2021, Virtual
Event, USA. ACM, New York, NY, USA, 13 pages. https://doi.org/10.1145/
3441852.3471209
1 INTRODUCTION
Human eyes are windows into the mind, and we can understand
an individual's behavior by analyzing their gaze patterns. Our eyes
do not just help us see the world around us but also play an important
role in complex cognitive processing without requiring conscious
effort. Eye contact is an important part of social interactions, and
gaze patterns have provided a basis for gaining insight into autism [70].
Research on eye movement and eye monitoring flourished in the
1970s, with considerable advancement in both eye-tracking literature
and psychological theory to connect eye-tracking data
with cognitive processes [53]. Eye-tracking contributes heavily
to the field of Human-Computer Interaction (HCI) by playing
important roles such as an input method for interaction, an analysis
tool for usability testing, and a way to understand human behavior
in natural and controlled environments. Thus, with the help
of eye-tracking technologies, we can understand many cognitive
processes, interpret an individual's emotional state, and explain
complex behavioral patterns.
Studies have shown that along with verbal cues, humans tend
to unconsciously identify non-verbal cues to establish perceptions
about others [60, 78]. Blinks and head nods have been found to be
important indicators of engagement in social interactions.
Gupta et al. performed an eye-tracking study which indicated that
participants had more empathy in a face-to-face conversation and
tended to synchronize their eye blinks and head nods [45]. Simi-
larly, Nakano et al. found that if the mouth and eyes of a speaker
were visible in a video then the listeners would synchronize their
eye blinks with the speaker’s eye blinks [90]. Results from other
studies [52] also indicate that listeners’ blinks are often interpreted
as communicative signals and directly influence the communicative
actions of speakers. Research by Fred Cummins has examined
the relationship between gaze and blinks in dyadic conversations [33].
A study by Sandgren et al. shows that when a task is
performed between two people, the executor spends nearly 90%
of the time focusing on the task, 10% on the director's face, and
less than 0.5% elsewhere [103]. The listener looks more at the
speaker than the other way around; however, at key points the
speaker seeks an answer by looking at the listener, producing a
brief period of shared gaze [9].
Weiss has shown that eye movements play an important role in
providing non-verbal cues and various aspects of conversation such
as turn-taking or attentiveness, which appear to be directly tied
to an individual's eye movements [121]. In both one-on-one and
group conversations, the eye gaze of the audience plays an
important role in determining conversational turn-taking between
the speaker and the audience.
Deaf or Hard of Hearing (DHH) individuals have heightened
visual senses compared to hearing individuals; however, this
is best revealed when attention is considered [10]. Studies show
that DHH individuals use their visual senses more to compensate
for their hearing limitations [10] and to understand, interpret, and
communicate with the world around them. One study [74] has
indicated a connection between cross-modal cortical recruitment
and visual capacity in congenitally deaf cats, a phenomenon where
cross-modal reorganisation to visual senses occurs in the absence