Haptic Guidance of Light–Exoskeleton for Arm–Rehabilitation Tasks
Luis I. Lugo-Villeda, Antonio Frisoli, Oscar Sandoval-González, Miguel A. Padilla,
Vicente Parra-Vega, Carlo A. Avizzano, Emanuele Ruffaldi and Massimo Bergamasco.
Abstract— Fixed–base exoskeleton applications have increased rapidly in the last few years, largely as part of promising rehabilitation robotics programs worldwide, in which Human–Robot Interaction (HRI) plays an important role in design and control because exoskeletons are tightly coupled to the human limbs. Exoskeletons pose HRI as well as technological and theoretical challenges on the way to real and effective rehabilitation. In this realm, some questions arise, to name a few: what is the relationship governing the energy exchanged between human and exoskeleton? How can we assess rehabilitation factors under the HRI philosophy? This paper attempts to establish answers to these questions, which can be embodied into rehabilitation HRI using a light exoskeleton. A compliant haptic guidance scheme for the human arm, subject to a minimum–jerk trajectory criterion, is proposed. Preliminary experimental results provide further insight into a haptic guidance scheme that takes into account decisive HRI factors such as human pose, haptic guidance control, reaching and tracking tasks, the complexity of the virtual environment, and muscle activity.
I. INTRODUCTION
In the last few decades, robot-aided therapy has performed well enough to gain acceptance in the rehabilitation area [2], since the new trend of human–robot interaction (HRI) furnishes more efficient tools for robot-aided rehabilitation tasks than manual therapy [10]; e.g., the recovery process of a person who has suffered a stroke [7] is more effective, basically because rehabilitation robotics allows a less subjective assessment than conventional therapy approaches.
Focusing on robot-aided arm therapy, there is a vast literature as well as a range of commercial robots for rehabilitating the motion of the human arm, wherein HRI is included among the design criteria, in particular within virtual environments, to render more interactivity and a higher level of comfort for the patient during recovery sessions [6]. Rehabilitation robots for arm therapy can be classified
into: a) End–Effector–based robots, and b) Exoskeletons¹ [14].

¹All the cited exoskeletons belong to the Upper–Part Fixed–Frame Exoskeleton (UFBEx) class.

Manuscript received March 30, 2009. This work was supported by SKILLS-IP and Scuola Superiore Sant'Anna.
Luis I. Lugo-Villeda, Antonio Frisoli, Oscar Sandoval-González, Miguel Padilla, Carlo A. Avizzano, Emanuele Ruffaldi and Massimo Bergamasco are with Perceptual Robotics (PERCRO), Scuola Superiore Sant'Anna, Via Martiri della Libertà, Pisa, Italy ({l.lugovilleda,a.frisoli}@sssup.it).
Vicente Parra-Vega is with the Robotics and Advanced Manufacturing Division, Research Center for Advanced Studies (CINVESTAV), Saltillo Campus, Carretera Saltillo-Monterrey Km 1.5, CP 25000, Ramos Arizpe, Coahuila, Mexico (vparra@cinvestav.mx).

The main difference between these two approaches
relies on the fact that exoskeletons are anthropomorphic
robots tightly attached to the human limbs, whose joint axes
fully determine the arm pose and its time variation, which is
not necessarily the case for End–Effector–based robots,
such as the MIT Manus [9]², which trains patients who have
suffered a stroke and have lost arm motor skills. In the case of
a), the main purpose of the MIT Manus is to guide the human arm in virtual environments, displaying both the desired and the actual reaching exercises carried out by the patient. Another example of a) is the Mirror Image Movement Enabler system [12], a PUMA 560-based robot which can impose bilateral 3-D force–position motions. Likewise, the Bi-Manu-Track robot for upper-limb rehabilitation provides active practice of forearm motions, such as pronation/supination and wrist flexion/extension, in a mirror-like fashion [17]. The EU Gentle/s project presents a 3 degree-of-freedom (DoF) haptic device for right-handed subjects, which uses virtual environments for reaching exercises and tackles arm tracking of smooth polynomial-based trajectories [1]. Finally, the ARM-Guide [16] drives the forearm along a linear position profile in 2-D space, providing guided force training in joint space.
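Smooth point-to-point profiles of this kind are commonly generated with the minimum-jerk criterion, the same criterion adopted for the haptic guidance scheme proposed here. A minimal sketch of the classic fifth-order minimum-jerk polynomial is given below; the function name and sampling choices are our own illustration, not taken from any of the cited systems:

```python
import numpy as np

def minimum_jerk(x0, xf, T, n=5):
    """Minimum-jerk point-to-point profile: the fifth-order polynomial
    x(t) = x0 + (xf - x0) * (10 s^3 - 15 s^4 + 6 s^5),  s = t/T,
    which yields zero velocity and zero acceleration at both endpoints."""
    t = np.linspace(0.0, T, n)
    s = t / T
    return x0 + (xf - x0) * (10 * s**3 - 15 * s**4 + 6 * s**5)

# Example: a 1-D reach from 0 m to 0.3 m in 2 s, sampled at 5 points
x = minimum_jerk(0.0, 0.3, 2.0, n=5)
```

Because the profile is symmetric, the midpoint of the motion lies exactly halfway between start and goal, which is one reason such profiles feel natural when replayed through a haptic guidance controller.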
Under the second class, b), we have the ARMin II, a 6-DoF exoskeleton for human-arm rehabilitation that has been used in conjunction with virtual environments and is suitable for helping post-stroke patients. The exoskeleton uses a simple PD plus gravity compensation controller to obtain apparent interactive forces, which are computed on the basis of the virtual environment interaction [13]. Finally, a 7-DoF upper-limb exoskeleton which can be used for therapeutic diagnostics, for physiotherapy, or as a haptic device in virtual reality is presented in [15]; however, it introduces neither techniques for carrying out rehabilitation nor a control scheme that assures stability.
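A PD plus gravity compensation law of the kind mentioned above can be sketched as follows. The two-link gravity model and the gain values are illustrative assumptions for a planar arm, not the actual controller or parameters of [13]:

```python
import numpy as np

def pd_gravity_torque(q, dq, q_des, Kp, Kd, gravity):
    """PD regulator with gravity compensation:
    tau = Kp (q_des - q) - Kd dq + g(q).
    The g(q) term cancels gravity, so the PD part alone shapes the
    apparent stiffness and damping felt by the user's limb."""
    return Kp @ (q_des - q) - Kd @ dq + gravity(q)

def gravity(q, m1=2.0, m2=1.5, l1=0.3, l2=0.25, g=9.81):
    """Illustrative gravity vector of a 2-link planar arm
    (link masses/lengths are assumed, with COM at mid-link)."""
    g1 = (m1 * l1 / 2 + m2 * l1) * g * np.cos(q[0]) \
         + m2 * l2 / 2 * g * np.cos(q[0] + q[1])
    g2 = m2 * l2 / 2 * g * np.cos(q[0] + q[1])
    return np.array([g1, g2])

Kp = np.diag([60.0, 40.0])   # joint stiffness gains (assumed)
Kd = np.diag([6.0, 4.0])     # joint damping gains (assumed)
q = np.array([0.2, 0.4])     # current joint angles [rad]
tau = pd_gravity_torque(q, np.zeros(2), q, Kp, Kd, gravity)
# At q = q_des with zero velocity, tau reduces to the gravity vector
```

Because gravity is cancelled, such a regulator holds any posture with zero steady-state PD effort; its limitation, discussed next, is that it is a set-point scheme rather than a tracking controller.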
In the realm of HRI for rehabilitation purposes, two important issues arise: (i) simple regulator-type control schemes have been used, limiting their scope and impact in demanding applications such as rehabilitation robotics, which is essentially a tracking task; and (ii) can we improve performance by using signals measured directly from the patient during rehabilitation tasks? These questions are addressed in this paper in the framework of haptic guidance, using preliminary patient EMG readings, with promising results. Notice that haptic guidance for UFBEx has so far been achieved indirectly with conventional controllers; however, these do not fully exploit the exoskeleton's performance, since only simple controllers are implemented. In this paper
²A pantograph-based manipulator which works in 2-D.
The 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan, Sept. 27–Oct. 2, 2009. ThA4.2. 978-1-4244-5081-7/09/$26.00 ©2009 IEEE. 903