Tactile Sensing: Steps to Artificial Somatosensory Maps
Giorgio Cannata, Simone Denei, Fulvio Mastrogiovanni
Abstract— In this paper, a framework for representing tactile
information in robots is discussed. Control models exploiting
tactile sensing are fundamental in social Human-Robot
interaction tasks. The difficulties in rendering the sense of
touch in robots arise at different levels: both representation and
computational issues must be considered. A layered system,
inspired by tactile sensing in humans, is proposed for
building artificial somatosensory maps in robots. Experiments
in simulation are used to validate the approach.
I. INTRODUCTION
Robots exploiting tactile information are expected to exhibit
advanced capabilities in physical and social Human-Robot
Interaction (HRI in short). The sense of touch is a
fundamental feature for control models based on physical
interaction cues. Appropriate social and physical stimuli are
needed to enhance the quality of the interaction in terms of
robot behaviour and responsiveness.
To date, studies in HRI have been largely devoted to
investigating suitable models for modulating interaction
behaviours at the social level [17]. Aspects related to physical
interaction have received considerable attention, mostly with
respect to tactile sensing and specifically to transduction
technologies [5]. Although the need arises to integrate
information from both physically and socially oriented models
of interaction, the direct use of tactile data in designing
control strategies enforcing social interaction rules has not
received much attention in the literature.
One possibility is to design appropriate representation
structures that mimic somatosensory mapping in humans.
On the one hand, these structures must guarantee a unique
mapping between the tactile elements on the robot surface
and their representation; on the other hand, they must be
accessible to high-level behaviours implementing social
models of interaction. During the past few years, a number
of approaches have partially addressed these key issues. A
model aimed at translating contact phenomena into language-like
symbols has been presented in [23], where the focus is
more on the relationship between numerical and symbolic
data than on the use of such information at the control
level. The work presented in [15] addresses the problem of
emergent behaviour arising through sensory-motor interaction
between an agent capable of full-body movements and the
surrounding environment. A somatosensory map is obtained
All the authors are with the Department of Communication, Computer
and System Sciences, University of Genova, Via Opera Pia 13, 16145,
Genova, Italy. Corresponding Author email:fulvio@dist.unige.it.
The research leading to these results has received funding from the
European Community’s Seventh Framework Programme (FP7/2007-2013)
under grant agreement n. 231500/ROBOSKIN.
by correlating signals from tactile sensors distributed over
the agent surface. A similar approach, based on Informa-
tion Theory, has been proposed in [20], where sensoritopic
maps of groups of sensors are created using self-organizing
processes. Feedback from groups of tactile sensors has been
used in [19] to determine sensoritopic connections between
correlated taxels: a mostly manual learning process is used to
activate groups of nearby taxels, which are then considered
topographically close to each other.
Both representation and computational issues must be
considered when designing artificial somatosensory maps:
• Designing internal models for representing tactile
information is a novel research endeavour. Unlike
vision images, tactile images cannot be easily flattened
onto a 2D metric space, since they originate from elements
located on curved surfaces. Concepts such as
proximity, feature extraction or data filtering cannot
be easily applied. Furthermore, cameras provide
information from a well-defined location in space, whereas
taxels are distributed over large parts of the robot
surface, which are subject to kinematic constraints. A
natural representation preserving skin topology must be
available.
• Data structures must guarantee easy access to
semantically well-defined tactile information. Tactile
data at different resolutions must be accessible according
to the task at hand: high-resolution tactile images are
needed for fine contact dynamics, whereas reactive
behaviours can be attained by manipulating low-resolution
information with negligible computational
load. The characteristics of the contact must be
accessible from the representation.
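As an illustrative sketch (not the authors' implementation), multi-resolution access of the kind described above could be provided by aggregating neighbouring taxel readings into coarser cells, keeping the full-resolution array available for fine contact analysis. The function name and grouping scheme below are hypothetical:

```python
import numpy as np

def downsample_taxels(pressures, group_size):
    """Aggregate high-resolution taxel readings into a coarser map
    by averaging fixed-size groups of neighbouring taxels.

    pressures : sequence of per-taxel pressure values (high resolution)
    group_size: number of adjacent taxels merged into one coarse cell
    """
    n = len(pressures) // group_size * group_size
    groups = np.asarray(pressures[:n], dtype=float).reshape(-1, group_size)
    return groups.mean(axis=1)  # one value per group of taxels

# A reactive behaviour could poll only the coarse map, while fine
# contact dynamics would read the full-resolution array directly.
high_res = [0.0, 0.2, 0.4, 0.6, 1.0, 1.0, 0.0, 0.0]
low_res = downsample_taxels(high_res, 4)  # two coarse cells
```

Averaging is just one aggregation choice; a maximum over each group would instead preserve contact peaks at low resolution.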
Approaches in the literature do not maintain the mapping
between the location of taxels in 3D space and their
representation in the artificial somatosensory maps. Maintaining
this mapping is fundamental for using tactile data in practice,
e.g., when reacting to sudden contacts. The main contribution
of this work is a model of somatosensory maps addressing
representation issues, loosely inspired by tactile rendering in
humans. It also paves the way for further developments on
the computational side, which are outside the scope of the
present discussion. Inspired by the beautiful images of [24],
a hierarchical architecture for tactile rendering is proposed
that exploits Surface Parameterization techniques to model
somatosensory maps.
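To give a minimal sense of what a surface parameterization provides here, consider the simplest case of taxels mounted on a cylindrical skin patch (a hypothetical example, not the paper's method): unrolling the cylinder gives 2D coordinates that preserve distances along the surface, so the skin topology survives the flattening.

```python
import math

def cylinder_to_plane(x, y, z, radius):
    """Map a taxel on a cylindrical skin patch of the given radius
    to 2D coordinates (u, v). Unrolling the cylinder (u = radius *
    angle, v = height) preserves geodesic distances on the surface,
    so taxels that are neighbours in 3D stay neighbours in the map.
    """
    theta = math.atan2(y, x)   # angular position around the cylinder axis
    return radius * theta, z   # (u, v) in the parameter plane

# Two taxels close on the curved surface remain close in the flat map.
u1, v1 = cylinder_to_plane(1.0, 0.0, 0.5, radius=1.0)
u2, v2 = cylinder_to_plane(math.cos(0.1), math.sin(0.1), 0.5, radius=1.0)
```

For general curved robot surfaces no such distortion-free unrolling exists, which is precisely why dedicated Surface Parameterization techniques are needed.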
The paper is organized as follows. Section II describes
how tactile information is processed in humans.
19th IEEE International Symposium on Robot and
Human Interactive Communication
Principe di Piemonte - Viareggio, Italy, Sept. 12-15, 2010
978-1-4244-7989-4/10/$26.00 ©2010 IEEE 576