Robotic Assisted Micromanipulation System using
Virtual Fixtures and Metaphors
Mehdi Ammi*, Antoine Ferreira†
*LIMSI-CNRS, Université de Paris-Sud, France
Email: mehdi.ammi@limsi.fr
†Laboratoire Vision et Robotique, ENSI de Bourges - Université d'Orléans, France
Email: antoine.ferreira@ensi-bourges.fr
Abstract— This paper describes the use of virtual fixtures and
metaphors of assistance for a robotic-assisted micromanipulation
system, in order to mitigate the influence of microscale physics on
path planning and handling tasks. The system is based on a
multimodal telemanipulation platform using haptic/visual/sound
interfaces for the observation of microobjects under an optical
microscope. Feasible haptically generated paths, based on
potential-field reaction forces and shock absorbers, are described for
efficient and safe pushing-based or adhesion-based micromanipulation.
Metaphors exploiting human sensory substitution are then
proposed to improve the perception of data and events.
Finally, an experimental investigation carried out with nine trainees
shows that the system guides the operator's gesture efficiently and
safely. Moreover, user performance on a given task can increase by
as much as 52% in typical micromanipulation tasks.
I. INTRODUCTION
In microscale manipulation, current telerobotic tasks require
the human operator to perform high-precision, repeatable, and safe
operations in confined environments. Typical examples can be
found in microelectromechanical systems (MEMS) assembly
[1] or in the injection of substances (DNA, RNA)
into biological cells [2]. Currently, such tasks are performed
under an optical microscope, where forces are imperceptible
and depth perception is limited. Tremor, fatigue, and stress
are magnified, which degrades the accuracy and efficiency of
micromanipulation tasks. Vision-based virtual fixtures can
overcome these human limitations by providing guidance and
assistance tools for robot-assisted micromanipulation tasks [3], [4].
In the field of micromechatronics, Song et al. [5] proposed a
telemicromanipulation system assisted by augmented reality,
in which visual virtual guides enhance the visibility and
perception of the operator performing microassembly tasks. In
the domain of biology and surgery, Kumar et al. [6] experimented
with a Steady-Hand robotic system (SHR) for vitreoretinal
microsurgery, where guidance virtual fixtures improved the
speed and efficacy of the procedure. Based on the SHR system,
Kapoor et al. [7] also proposed the use of vision-based virtual
fixtures within the force controller for safe biological microinjection
tasks.
Consequently, this approach overcomes the inadequate precision
of motion and force control in freehand procedures. Virtual
fixtures can restrict motion along given directions and/or
planes in order to guide motion toward specific locations
[8]. They allow the operator to perform tasks with higher
confidence and accuracy, knowing that the typical
limitations of human skill at the microscale are largely
overcome. Studies have shown that user performance on a
given task can increase by as much as 70% after the introduction
of virtual fixture guidance [9]. Virtual fixtures can be designed
with different levels of motion guidance, ranging from
complete guidance (hard fixture) through limited guidance (soft
fixture) to no guidance. Most micromanipulation
tasks require a mixture of these three types of fixtures.
For example, a robotic microassembly task requires different fixtures
according to the task decomposition: (i) obstacle avoidance
(no guidance), (ii) path following (soft fixture),
and (iii) insertion (hard fixture) [10]. In this study, several
virtual fixtures are proposed, experimented with, and characterized.
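The hard/soft/no-guidance spectrum above is commonly parameterized by a single compliance gain that attenuates motion orthogonal to a preferred direction. The following sketch illustrates this general idea; it is not the formulation used in this paper, and the function name and `compliance` parameter are hypothetical.

```python
import numpy as np

def apply_guidance_fixture(v_user, d_pref, compliance):
    """Attenuate the off-path component of the user's commanded velocity.

    compliance = 0.0 -> hard fixture (motion restricted to the path direction)
    compliance = 1.0 -> no guidance (input passed through unchanged)
    0 < compliance < 1 -> soft fixture (off-path motion attenuated)
    """
    d = np.asarray(d_pref, dtype=float)
    d = d / np.linalg.norm(d)            # unit preferred direction
    v = np.asarray(v_user, dtype=float)
    v_par = np.dot(v, d) * d             # component along the path
    v_perp = v - v_par                   # off-path component
    return v_par + compliance * v_perp
```

With `compliance = 0` the operator's motion is projected entirely onto the path (hard fixture), while intermediate values still permit deliberate deviations, e.g. for obstacle avoidance.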
In Section 2, we describe a multimodal human-machine
interface based on virtualized-reality techniques for real-time
telemicromanipulation with vision, force, and sound feedback.
Section 3 then proposes different virtual fixtures for
operator guidance and assistance during micromanipulation
tasks. Finally, Section 4 presents a series of experiments that
validate the proposed virtual haptic fixtures.
II. MULTISENSORY TELEMICROMANIPULATION SYSTEM
Fig. 1. Architecture of the multisensory telemicromanipulation system.
Fig. 1 shows a multisensory human-machine interface (HMI)
system connected to an AFM-based micromanipulator working
within the field of view of an optical microscope. In this
2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10-14 April 2007