To date, most human–computer interactive
systems have focused primarily on the
graphical rendering of visual information and, to a lesser extent, on the display of auditory information. Among all the senses, the human haptic system is unique in providing bidirectional communication between humans and their physical environment. Extending the frontier of
human–machine interaction, haptic interfaces—or force
feedback devices—have the potential to increase the
quality and capability of human–computer interaction
by exploiting our sense of touch and ability to skillfully
manipulate objects. The direct physical interaction with
computer-generated objects enabled by haptic interfaces
provides a useful and intuitive augmentation to visual
display and the opportunity to enhance the understanding of, and interaction with, complex data sets.
Several novel applications already effectively use haptic
technologies; these include molecular docking, nanomaterial manipulation, surgical training, virtual prototyping, and digital sculpting.
Compared to visual and auditory display, haptic rendering has extremely demanding computational requirements. To maintain a stable system while displaying smooth and realistic forces and torques, haptic update rates of 1 kHz or more are typical. Haptics presents new challenges in the development of novel data structures to encode shape and material properties, as well as new techniques for data processing, analysis, physical modeling, and visualization. This special issue examines some of the latest advances in haptic rendering and applications, and provides an introductory view of the challenges and opportunities in the field.
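As a rough sketch of the 1 kHz servo-rate requirement described above, the loop below renders contact with a flat virtual wall using a simple spring penalty force. The stiffness value, the wall model, and all names here are illustrative assumptions for exposition only, not the method of any article in this issue:

```python
# Minimal sketch of a 1 kHz haptic servo loop rendering contact with a
# flat wall at y = 0 via a spring penalty force F = k * penetration.
# Stiffness and wall geometry are illustrative assumptions.

DT = 0.001          # 1 kHz servo period, in seconds
STIFFNESS = 500.0   # N/m; a modest stiffness many devices can render stably

def penalty_force(y):
    """Force along +y pushing the probe out of the wall at y = 0."""
    penetration = -y if y < 0.0 else 0.0
    return STIFFNESS * penetration

def run_loop(trajectory):
    """Compute one force sample per 1 ms tick of recorded probe positions."""
    return [penalty_force(y) for y in trajectory]

if __name__ == "__main__":
    # Probe approaches, dips 2 mm into the wall, then withdraws.
    ys = [0.004, 0.001, -0.002, -0.001, 0.003]
    forces = run_loop(ys)
    print(forces)  # nonzero only while the probe penetrates the wall
```

In a real system the loop body must finish well within the 1 ms budget; exceeding it degrades the rendered stiffness and can destabilize the device.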
This issue
The first article, a tutorial by Salisbury, Barbagli, and Conti, provides an overview of the haptics field with a particular focus on the architecture of haptic rendering systems and the devices that enable force-feedback-based haptic interaction. Unlike graphic and auditory rendering, the rendering of haptic interactions requires modeling the physical interactions between objects and generating, in real time, the forces that arise during contact and motion. As in the field of computer graphics, practitioners of computer haptics are intensely interested in the best methods for rendering objects so that humans can meaningfully perceive them. This topic alone is a vast area that will occupy researchers for years to come. The first article gives a basic view of methods used to render the way an object feels—it discusses modeling these physical interactions and the dynamic stability issues that arise in real-world implementations.
In the design of haptic systems, we need to consider the sensory, perceptual, and cognitive abilities and limitations of humans. In the second article, Hale and Stanney present a review of physiological and psychophysical aspects of human cutaneous and kinesthetic senses, followed by a discussion of issues related to incorporating haptic interaction with a graphical display. The authors present several design guidelines for developing multimodal interaction systems. Their objective is to identify conditions under which haptic interaction might enhance human perception and performance. By combining neurological and behavioral research methods, they evaluate various sensory integration methods (for example, ramp-up patterns or timing) for better design of haptic interaction systems.
A key area in haptics receiving increased attention is the rendering of surface texture, which in haptic rendering typically refers to microgeometric features on object surfaces. Intrinsic surface properties like texture are among the most salient haptic characteristics of objects. The third article, by Choi and Tan, presents a survey of systematic studies of the issues that contribute to the perceived instability of haptic texture rendering. The authors conduct psychophysical experiments to investigate the conditions under which perceived instability of virtual textures occurs and the types of perceived instability frequently reported by users. By analyzing the measured data, they identify the proximal stimuli that cause the perceived instability and indicate the sources that produce those stimuli.
The next article by Mahvash and Hayward describes
an efficient method to synthesize the nonlinear haptic
response of deformable models from prerecorded sim-
Guest Editors' Introduction

Haptic Rendering—Beyond Visual Computing

Ming Lin, University of North Carolina, Chapel Hill
Kenneth Salisbury, Stanford University

March/April 2004. Published by the IEEE Computer Society. 0272-1716/04/$20.00 © 2004 IEEE