K.S. Candan and A. Celentano (Eds.): MIS 2005, LNCS 3665, pp. 102 – 114, 2005.
© Springer-Verlag Berlin Heidelberg 2005
Modeling Context in Haptic Perception, Rendering
and Visualization
Kanav Kahol, Priyamvada Tripathi, Troy McDaniel, and Sethuraman Panchanathan
Center for Cognitive Ubiquitous Computing,
Department of Computer Science and Engineering,
Arizona State University, Tempe, Arizona USA 85287
{kanav, pia, troy.mcdaniel, panch}@asu.edu
Abstract. Haptic perception refers to the human ability to perceive spatial
properties through tactile and haptic sensations. Humans have an uncanny
ability to analyze objects based only on sparse information from haptic stimuli.
Contextual cues about an object’s material, overall shape, size, and weight, as perceived by an individual, lead to recognition of the object and its
spatial features. In this paper, we present strategies and algorithms for modeling
context in haptic applications that allow users to explore objects haptically in
virtual and augmented reality. Our methodology is based on modeling the
user’s cognitive and motor strategies of haptic exploration. Additionally, we
model the physiological arrangement of tactile sensors in the human hand.
These models provide the context needed to adapt haptic displays both to a user’s
style of haptic perception and exploration and to the current state of that
exploration. We
designed a tactile cueing paradigm to test the validity of the contextual models.
Initial results show improved accuracy and efficiency of haptic
perception compared to conventional approaches that do not model
context in haptic rendering.
1 Introduction
The term ‘haptics’, derived from the Greek word haptikos, refers to the sense of
touch and generally encompasses both the kinesthetic and tactile modalities [13].
Both sighted and blind individuals perceive spatial information through haptics.
While the significance of touch as a modality is a controversial topic when studying
sighted individuals, it is widely accepted that individuals who are blind or
deaf-blind employ haptic perception to develop spatial representations, and that
haptic sensation forms an important part of the human sensory and perceptual
apparatus [13].
The desire for natural and intuitive human-machine interaction has led to the
inclusion of haptics in human-computer interfaces. Such interfaces allow users to
provide input to a system through hand movements, and to receive haptic feedback
through vibrotactile stimulation of the hands. Haptic joysticks, haptic mice, and
haptic gloves are examples of commercially available devices that can simulate force
and/or tactile feedback. While the potential for haptics in natural human-machine
interaction is intriguing, the realization of practical interfaces has not yet been