Exp Brain Res
DOI 10.1007/s00221-014-4011-0
RESEARCH ARTICLE
Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies
Jennifer L. Campos · John S. Butler ·
Heinrich H. Bülthoff
Received: 22 November 2013 / Accepted: 31 May 2014
© Springer-Verlag Berlin Heidelberg 2014
Abstract Recent research has provided evidence that visual and body-based cues (vestibular, proprioceptive and efference copy) are integrated using a weighted linear sum during walking and passive transport. However, little is known about the specific weighting of visual information when combined with proprioceptive inputs alone, in the absence of vestibular information about forward self-motion. Therefore, in this study, participants walked in place on a stationary treadmill while dynamic visual information was updated in real time via a head-mounted display. The task required participants to travel a predefined distance and subsequently match this distance by adjusting an egocentric, in-depth target using a game controller. Travelled distance information was provided either through visual cues alone, proprioceptive cues alone or both cues combined. In the combined cue condition, the relationship between the two cues was manipulated by changing either the visual gain across trials (0.7×, 1.0×, 1.4×; Exp. 1) or the proprioceptive gain across trials (0.7×, 1.0×, 1.4×; Exp. 2). Results demonstrated an overall higher weighting of proprioception over vision. These weights were scaled, however, as a function of which sensory input provided more stable information across trials. Specifically, proprioceptive weights were higher when the visual gain was constantly manipulated than when the proprioceptive gain was constantly manipulated. These results therefore reveal interesting characteristics of cue-weighting within the context of unfolding spatio-temporal cue dynamics.

Keywords Optic flow · Proprioception · Multisensory integration · Distance estimation · Self-motion · Cue conflict

Introduction

During everyday walking, dynamic visual information and information from the motor and vestibular systems are intrinsically linked. Understanding the interplay between these sensory signals is important for a wide variety of behaviours and applications. Studying tasks related to self-motion perception also provides a unique opportunity to better understand the mechanisms of multisensory integration during causally related interactions between internal (proprioceptive/vestibular/efference) and external (visual) sensory information.

A popular approach to quantifying relative cue-weighting has been to create a subtle conflict between the spatial or temporal characteristics of two or more sensory cues. In the context of self-motion perception, this has been done
J. L. Campos · J. S. Butler · H. H. Bülthoff (*)
Department of Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Spemannstr. 38, 72076 Tübingen, Germany
e-mail: heinrich.buelthoff@tuebingen.mpg.de

J. L. Campos (*)
Toronto Rehabilitation Institute, University Health Network, 550 University Ave., Toronto, ON M5G 2A2, Canada
e-mail: Jennifer.Campos@uhn.ca

J. L. Campos
Department of Psychology, University of Toronto, Toronto, ON, Canada

J. S. Butler
Trinity Centre for Bioengineering, Trinity Biomedical Science Institute, Trinity College, Dublin, Ireland

H. H. Bülthoff
Department of Brain and Cognitive Engineering, Korea University, Seoul 136-713, Korea