Co-Location of Force and Action Improves Identification of
Force-Displacement Features
Jeremy D. Brown*
University of Michigan
Ann Arbor, MI USA
R. Brent Gillespie†
University of Michigan
Ann Arbor, MI USA
Duane Gardner‡
University of Michigan
Ann Arbor, MI USA
Emmanuel A. Gansallo§
University of Michigan
Ann Arbor, MI USA
ABSTRACT
Haptic display is a promising means to deliver sensory feedback to
an amputee from an upper limb prosthesis equipped with electronic
sensors. Haptics, however, describes a diverse set of sensory and
perceptual modalities. The question arises: which modality might
best serve the purposes of the prosthesis wearer, and which body
site should be used? To begin to answer these questions, we have
conducted an experiment involving n=14 participants in which reaction
force was displayed either to the same hand used to explore a
virtual object (co-located condition), or to the opposing hand (non
co-located condition). In randomly ordered trials, reaction forces
were derived from the commanded motion according to one of three
force-displacement relationships, describing a linear spring, a soft-
ening spring, and a stiffening spring. All springs shared a common
rest length and terminal force. Results indicate a significant
difference between the co-located and non co-located force display
conditions in terms of both identification accuracy and identification
time. Our findings suggest that haptic modalities capable of coupling
action and reaction will provide the most utility to amputees with an
upper limb prosthesis.
Keywords: human-machine interface, prosthetics, sensory substi-
tution
1 INTRODUCTION
Technological advances in electronics and mechanical hardware
have opened up a tremendous opportunity to engineer new pros-
thetic devices that fully replicate the form and function of a lost
upper limb. Advances in neural interfaces, including capturing user
intent and relaying sensory feedback through bioelectronic transducers,
appear to hold significant promise as well, but these wetware
technologies seem to lie a bit further out than the hardware
[1, 4, 15, 20, 24]. Is there anything to be done to interface new
prosthetic designs in the meantime? What can be done to discern
user intent without breaking the skin, and what can be done to en-
gage physiological receptors for sensory feedback?
To discern user intent noninvasively, a number of technologies
are available and even commercialized, including electromyogra-
phy (EMG) and slaving prosthetic joints to physiological joints on
the body (body-powered prosthetics). To display signals acquired
by electronic sensors mounted on the prosthesis, however, relatively
few technologies have been developed. Direct vision is already
available to the prosthesis wearer of course, but one of the fea-
tures most requested by prosthesis users is the ability to perform
tasks without having to visually monitor interactions between the
prosthesis and task object [2].
* e-mail: jdelaine@umich.edu
† e-mail: brentg@umich.edu
‡ e-mail: dgardn@umich.edu
§ e-mail: emmangan@umich.edu
It is useful to note that any attempt to relay sensory feedback
from electronic transducers on a prosthesis to the body of the am-
putee is a challenge in sensory substitution. Generally, the sensory
receptors and nerve bundles that previously served the hand are no
longer available to be stimulated across the skin. What is stimu-
lated at the display site, the residual limb or other part of the body,
pertains to a distal sensing site, a site on the worn prosthesis. Thus,
there is to some extent a referral or translation that must be resolved
by the brain. Ideally, the prosthesis will be adopted into the body
schema of the user—the prosthesis becomes a part of the body.
Waiting to be developed for application in prosthetics is quite a
suite of haptic display technologies: vibrotactile [7, 9, 14, 23, 26],
skin stretch [17, 28], squeeze and nudge displays [11], force feedback
[21], and force feedback through exoskeletons that span joints
on the residual body [12, 13], motion display [5, 18], and maybe
electrocutaneous stimulation. Perhaps if we could characterize the
information flow rates and the just noticeable differences for each
of these display technologies, we would be in a better position to
select the best technology for a given prosthesis. The displays that
achieve the highest information transmission rates would certainly
rise to the top. However, it might be the case that the appropri-
ateness of a particular display technology depends on the task for
which the prosthesis is employed. Indeed, we expect that the tra-
ditional psychophysical parameters, which explain the absolute and
difference limens of a particular stimulus, will not provide sufficient
task-specific guidelines for designing displays that incorporate that
stimulus into the interface of recently developed prosthesis devices.
In particular, sensorimotor processing on the part of the user must
be accommodated by the interface design.
Discrimination of object stiffness is the facet of haptic explo-
ration that we have used to begin our exploration of the relationship
between display modality and task. Stiffness, or its inverse,
compliance, is a property of an object that may be determined through
haptic exploration and often plays an important role in manipula-
tion. Stiffness expresses the relationship between force and dis-
placement and cannot be determined without deforming an object.
Also, even though stiffness can be inferred from vision alone [10],
the measurement has high variability, and depends heavily upon the
ability to discriminate object deformations under common forcing
conditions. Note that stiffness variations that occur during object
deformation or because of nonlinear force-displacement relation-
ships often suggest brittleness or a tendency to break. Also, except
by the visual methods described above, stiffness discrimination is
not possible with today’s commercially available myoelectric upper
limb prostheses. Yet stiffness is an important cue for identifying ob-
jects, and for assessing the composition of objects and tissues. We
discriminate stiffness when we squeeze a fruit in the grocery store,
shake hands, and sort objects in our pocket.
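As a concrete sketch, the three kinds of force-displacement relationships used in our experiment (a linear, a softening, and a stiffening spring sharing a common rest length and terminal force) can be modeled as follows. The particular functional forms and parameter values below are illustrative assumptions, not the expressions used in the study:

```python
import math

X_MAX = 0.05   # assumed maximum displacement in meters (illustrative value)
F_MAX = 10.0   # assumed common terminal force in newtons (illustrative value)

def linear(x):
    """Linear spring: constant stiffness k = F_MAX / X_MAX."""
    return F_MAX * (x / X_MAX)

def softening(x):
    """Softening spring: stiffness dF/dx decreases with displacement."""
    return F_MAX * math.sqrt(x / X_MAX)

def stiffening(x):
    """Stiffening spring: stiffness dF/dx increases with displacement."""
    return F_MAX * (x / X_MAX) ** 2

# All three share the same rest length (zero force at zero displacement)
# and the same terminal force (F_MAX at X_MAX), so they differ only in
# the shape of the force-displacement curve between those endpoints.
for spring in (linear, softening, stiffening):
    assert abs(spring(0.0)) < 1e-12
    assert abs(spring(X_MAX) - F_MAX) < 1e-9
```

At mid-stroke the three curves separate: the softening spring produces the largest force and the stiffening spring the smallest, which is the kind of cue a participant can use to tell the springs apart even though all three agree at the endpoints.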
The stiffness of rigid objects and objects with surfaces that are
deformable beyond the sensitivity of the cutaneous senses is en-
coded in a force/motion relationship [3, 25]. Therefore, when dis-
criminating stiffness using sensory substitution, it seems that force
feedback would be most appropriate. To investigate this claim, a
comparison of force feedback and say, vibrotactile feedback would
be one place to start. In such a case, however, it would be incumbent
IEEE Haptics Symposium 2012
4-7 March, Vancouver, BC, Canada