A virtual piano-playing environment for rehabilitation
based upon ultrasound imaging
Claudio Castellini, Katharina Hertkorn, Mikel Sagardia, David Sierra González and Markus Nowak
Abstract— In this paper we evaluate ultrasound imaging as
a human-machine interface in the context of rehabilitation.
Ultrasound imaging can be used to estimate finger forces in
real-time with a short and easy calibration procedure. Forces
are individually predicted using a transducer fixed on the
forearm, which leaves the hand completely free to operate. In
this application, a standard ultrasound machine is connected
to a virtual-reality environment in which a human operator
can play a dynamic harmonium over two octaves, using any
finger (including the thumb). The interaction in the virtual
environment is managed via a fast collision detection algorithm
and a physics engine.
Ten human subjects engaged in two games of
increasing difficulty. Our experimental results, both objective
and subjective, clearly show that both tasks could be accom-
plished to the required degree of precision and that the subjects
underwent a typical learning curve. The learning happened
uniformly, irrespective of the required finger, force or note. Such
a system could be made portable, and has potential applications
as a rehabilitation device for amputees and the muscle-impaired,
even at home.
I. INTRODUCTION
Standard ultrasound imaging as employed in hospitals
(also called medical ultrasonography, US imaging from now
on) is a non-invasive technique to visualise structures inside
the human body [9] exploiting the principle of wave reflec-
tion. Piezoelectric transducers are used to generate a focused
wave of ultrasound which penetrates the body part of interest;
partial reflection of the wave at the interfaces between tissues
with different acoustic impedance is converted to a grey-
scale 2D image (in the so-called B-mode). High values of
grey denote tissue interfaces. US imaging has no known side
effects [36] and is routinely used in hospitals. It can be used
in, e.g., recognition of skin cancer [18], tumor segmentation
[37], and anatomical landmark detection in the foetus [27].
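The image contrast described above derives from the impedance mismatch at each tissue boundary: the pressure reflection coefficient at a planar interface is R = (Z2 − Z1)/(Z2 + Z1). As a quick illustration (the impedance values below are rough textbook figures, not taken from this paper):

```python
# Sketch of where B-mode contrast comes from: the fraction of the
# ultrasound pressure wave reflected at an interface between two
# tissues with acoustic impedances Z1 and Z2 (in MRayl).
# Impedance values are approximate textbook figures.

def reflection_coefficient(z1: float, z2: float) -> float:
    """Pressure reflection coefficient at a planar tissue interface."""
    return (z2 - z1) / (z2 + z1)

# Approximate acoustic impedances in MRayl (illustrative only)
impedances = {"fat": 1.34, "muscle": 1.71, "bone": 7.8}

for a, za in impedances.items():
    for b, zb in impedances.items():
        if a < b:  # each unordered pair once
            r = reflection_coefficient(za, zb)
            print(f"{a}/{b}: R = {r:+.3f}")
```

The large mismatch at soft-tissue/bone boundaries explains why bone surfaces appear as bright bands in B-mode images, while soft-tissue interfaces yield fainter echoes.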
Recently, this technique has been employed as a human-
machine interface to visualise residual lower limbs
and to assess the ergonomics of lower-limb prostheses [26],
[11]. As a human-machine interface for the control of upper-
limb prostheses, ultrasound imaging has been explored by
Zheng and others [38], [8], [17] and Castellini and others
[5], [6], revealing that it can actually be used to reconstruct
the hand and wrist configuration to a remarkable degree of
precision. Sikdar et al. [34] have demonstrated a system
able to classify finger motions, and predict finger motion
velocities, based upon k-nearest-neighbours. In particular,
a very fast and realistic calibration procedure has been
developed in previous work to estimate the finger forces of a
human hand from forearm US images [33]. The resulting
system runs at 30 Hz and works incrementally, meaning that
the calibration can be updated at will whenever required,
without retraining from scratch.

The authors are with the Robotics and Mechatronics Center, DLR
(German Aerospace Center), 82234 Oberpfaffenhofen, Germany. Email:
claudio.castellini@dlr.de

Fig. 1. The virtual-harmonium setup has three main components: an
ultrasound machine capturing images of the forearm, a magnetic tracker
to detect the hand movement, and a realistic virtual-reality environment.
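The cited estimator's equations are not reproduced here; as one illustration of what an incremental calibration that can be updated without retraining might look like, the following is a recursive least-squares sketch that refines a linear map from image features to finger forces one sample at a time (the feature dimensionality and number of forces are hypothetical, not taken from [33]):

```python
import numpy as np

class IncrementalLinearCalibration:
    """Recursive least-squares (RLS) update of a linear map from
    US-image features x (d-dim) to finger forces y (m-dim).
    Each new calibration sample refines the map in O(d^2) time,
    without refitting on previously seen data."""

    def __init__(self, d: int, m: int, reg: float = 1.0):
        self.W = np.zeros((d, m))   # current linear map
        self.P = np.eye(d) / reg    # inverse-covariance estimate

    def update(self, x: np.ndarray, y: np.ndarray) -> None:
        Px = self.P @ x
        k = Px / (1.0 + x @ Px)                  # gain vector
        self.W += np.outer(k, y - x @ self.W)    # correct by residual
        self.P -= np.outer(k, Px)                # rank-1 downdate

    def predict(self, x: np.ndarray) -> np.ndarray:
        return x @ self.W

# Usage: stream (features, forces) pairs during calibration.
# Here the "forces" come from a synthetic linear ground truth.
rng = np.random.default_rng(0)
true_W = rng.normal(size=(8, 5))
model = IncrementalLinearCalibration(d=8, m=5)
for _ in range(500):
    x = rng.normal(size=8)
    model.update(x, x @ true_W)
print(np.abs(model.W - true_W).max())
```

The per-sample update is what makes the "calibrate at will" workflow possible: new examples can be folded in during operation at frame rate, since no batch refit over the full history is ever needed.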
As a human-machine interface for virtual reality, US imag-
ing enjoys a number of advantages over standard approaches
specifically designed for the hand, which traditionally mea-
sure the hand's kinematic configuration rather than forces
and torques: instrumented gloves [10], marker-based finger
tracking [1] and markerless optical tracking (Leap Motion,
Kinect) [23], [19].
In some cases haptic feedback is provided, improving the
feeling of immersion [31] and the performance [15], [22].
Although many haptic devices and exoskeletons exist [12],
[32], dexterous finger feedback remains an open topic. In
contrast, US imaging can detect single finger forces as well
as positions to a remarkable degree of precision [6], [33]
and, since the only device needed in the online functioning
is the ultrasound transducer on the forearm, it leaves the
hand completely free to operate without any mechanical
hampering.
This leads naturally to one main application: to reconstruct
the desired finger positions and/or forces of an amputee
using the US images of the stump. It is well-known [4],
[35] that a remarkable amount of residual muscular activity
is present in the stump even decades after the operation;
2014 5th IEEE RAS & EMBS International Conference on
Biomedical Robotics and Biomechatronics (BioRob)
August 12-15, 2014. São Paulo, Brazil