Marco Agus
magus@crs4.it
Andrea Giachetti
giach@crs4.it
Enrico Gobbetti
gobbetti@crs4.it
Gianluigi Zanetti
zag@crs4.it
Antonio Zorcolo
zarco@crs4.it
CRS4
VI Strada Ovest
Z. I. Macchiareddu
I-09010 Uta (CA), Italy
www.crs4.it
Presence, Vol. 12, No. 1, February 2003, 110–122
© 2003 by the Massachusetts Institute of Technology
Real-Time Haptic and Visual Simulation of Bone Dissection
Abstract
Bone dissection is an important component of many surgical procedures. In this
paper, we discuss a haptic and visual simulation of a bone-cutting burr that is being
developed as a component of a training system for temporal bone surgery. We use
a physically motivated model to describe the burr-bone interaction, which includes
haptic force evaluation, the bone erosion process, and the resulting debris. The
current implementation, directly operating on a voxel discretization of patient-specific
3D CT and MR imaging data, is efficient enough to provide real-time feedback
on a low-end multiprocessing PC platform.
1 Introduction
Bone dissection is an important component of many surgical procedures.
In this paper, we discuss a real-time haptic and visual implementation of a
bone-cutting burr that is being developed as a component of a training simula-
tor for temporal bone surgery. The specific target of the simulator is mastoidectomy,
a very common operative procedure that consists of removing the
mastoid portion of the temporal bone with a burring tool. The importance
of computerized tools to support surgical training for this kind of
intervention has been recognized by a number of groups that are currently
developing virtual reality simulators for temporal bone surgery (Wiet et al.,
2000; Pflesser, Petersik, Tiede, Höhne, & Leuwer, 2000). Our work is characterized
by the use of patient-specific volumetric object models directly derived
from 3D CT and MRI images, and by a design that provides realistic visual
and haptic feedback, including secondary effects such as the obscuring of the
operational site due to the accumulation of bone dust and other burring de-
bris. The need to provide real-time feedback to users while simulating burring
and related secondary effects imposes stringent performance constraints. Our
solution is based on a volumetric representation of the scene, and it harnesses
the locality of the physical system evolution to model the system as a collection
of loosely coupled components running in parallel on a multiprocessor PC
platform. Previous work has demonstrated the effectiveness of voxel-based rep-
resentations for the generation of force feedback in the case of rigid body envi-
ronments (McNeely, Puterbaugh, & Troy, 1999), virtual clay models (Avila &
Sobierajski, 1996; Galyean & Hughes, 1991; Wang & Kaufman, 1995; He &
Kaufman, 1997), or deformable bodies (Cotin, Delingette, & Ayache, 1996;
Gibson et al., 1998; Frisken-Gibson, 1999; James & Pai, 2001).
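The locality-driven decomposition described above can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's implementation: the class and function names (`VoxelBone`, `erode_sphere`) and the linear erosion model are ours. It shows the key idea that each haptic tick touches only the voxels inside the burr's bounding region of a shared density grid, so the force computation stays cheap and can run at a high rate while a separate, slower visual component merely reads the same grid.

```python
import threading

GRID = 16  # tiny demo grid; the real simulator operates on patient CT volumes

class VoxelBone:
    """Shared voxel grid: density 1.0 = intact bone, 0.0 = fully eroded.
    A lock lets a slower visual reader coexist with the haptic writer."""

    def __init__(self):
        self.density = [[[1.0] * GRID for _ in range(GRID)]
                        for _ in range(GRID)]
        self.lock = threading.Lock()

    def erode_sphere(self, cx, cy, cz, r, rate):
        """Erode voxels inside a spherical burr of radius r centered at
        (cx, cy, cz); return a crude scalar proportional to the bone mass
        currently resisting the burr. Only the burr's bounding box is
        visited, which is what keeps the per-tick cost local and small."""
        resisting = 0.0
        with self.lock:
            for x in range(max(0, cx - r), min(GRID, cx + r + 1)):
                for y in range(max(0, cy - r), min(GRID, cy + r + 1)):
                    for z in range(max(0, cz - r), min(GRID, cz + r + 1)):
                        if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r:
                            resisting += self.density[x][y][z]
                            self.density[x][y][z] = max(
                                0.0, self.density[x][y][z] - rate)
        return resisting  # a real model maps this into a 3D feedback force

bone = VoxelBone()

def haptic_loop(ticks):
    """Stand-in for the high-rate haptic component: repeatedly erode and
    record the resisting mass (a proxy for force magnitude)."""
    return [bone.erode_sphere(8, 8, 8, 2, rate=0.1) for _ in range(ticks)]

forces = haptic_loop(12)
```

With this toy erosion law the recorded force decays monotonically as material is removed and reaches zero once the pocket under the burr is empty, which is qualitatively the behavior a bone-burring force model must reproduce.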
This article, an extended version of our IEEE Virtual Reality 2002 contribu-
tion (Agus, Giachetti, Gobbetti, Zanetti, & Zorcolo, 2002a), focuses on the