COMPUTER ANIMATION AND VIRTUAL WORLDS
Comp. Anim. Virtual Worlds 2007; 18: 463–472
Published online 13 July 2007 in Wiley InterScience
(www.interscience.wiley.com) DOI: 10.1002/cav.185
A steering model for on-line
locomotion synthesis
By Taesoo Kwon and Sung Yong Shin*
For applications such as video games and virtual walk-throughs, on-line locomotion
control is an important issue. In general, the user prescribes a sequence of motions one by
one while providing an input trajectory. Since the input trajectory lacks human
characteristics, one cannot synthesize quality motions by blindly following it. In this
paper, we present a novel data-driven scheme for transforming a user-prescribed trajectory
to a human trajectory in an on-line manner. As preprocessing, we analyze example motion
data to extract human steering behavior. At run-time, the input trajectory is refined to
reflect the steering behavior. Together with an existing on-line motion synthesis system, our
scheme forms a feedback loop, in which the user effectively specifies an intended human
trajectory. Copyright © 2007 John Wiley & Sons, Ltd.
Received: 11 May 2007; Accepted: 14 May 2007
KEY WORDS: computer animation; character animation; motion control
Introduction
On-line, real-time locomotion synthesis is an important
issue in applications such as video games and
virtual walk-throughs. Although locomotion synthesis has been
studied extensively, the issue of on-line steering of
human-like characters has not been
well addressed.
To make this concrete, consider Figure 1. Figure 1(a) and
(b) show how user-specified trajectories (colored red) are
different from actual human trajectories (colored green)
during straight walking and running. The oscillations of
the actual trajectories are due to the pelvis movements
(rotations and translations) caused by supporting feet.
Figure 1(c) and (d) show the variations of actual
pelvis trajectories during curved walking and running.
Such oscillations or curvature variations are the unique
characteristics of human steering behavior. Simply
placing the pelvis along a user-specified trajectory would
not produce a natural motion. This immediately raises
an issue: how to incorporate these characteristics into a
user-specified trajectory.
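As a rough illustration of the oscillations described above, the sketch below adds a lateral sinusoidal sway to a straight user-specified path, mimicking the side-to-side pelvis motion caused by alternating support feet. The function name, amplitude, and step frequency are hypothetical values chosen for illustration; they are not parameters from the paper, whose actual scheme is data-driven rather than analytic.

```python
import math

def pelvis_trajectory(path_points, step_freq_hz=2.0, sway_amp=0.03, dt=1/30):
    """Illustrative only: perturb a straight user path (list of (x, y)
    points, forward along +x) with a lateral sinusoid to mimic pelvis
    oscillation. All constants are assumed, not from the paper."""
    out = []
    for i, (x, y) in enumerate(path_points):
        t = i * dt
        # lateral offset perpendicular to the forward (+x) direction
        sway = sway_amp * math.sin(2 * math.pi * step_freq_hz * t)
        out.append((x, y + sway))
    return out
```

A straight input path thus acquires the kind of periodic lateral deviation visible in the actual (green) trajectories of Figure 1(a) and (b), whereas the raw user path (red) stays flat.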
*Correspondence to: S. Y. Shin, Korea Advanced Institute of
Science and Technology, Taejon, Korea.
E-mail: syshin@jupiter.kaist.ac.kr

For on-line applications, the user commonly prescribes
a motion by interactively providing a motion type and its
trajectory. In particular, the trajectory is specified either
explicitly by a point stream that is sampled with an
input device such as a mouse, or implicitly by a force
profile that is given with a user interface equipped with
slide bars (or a joystick). The former directly produces
the trajectory of a human-like figure. Although it is
easy to specify, the trajectory itself is neither precise
nor smooth. Moreover, it is far from a natural human
trajectory. The latter, on the other hand, integrates an input force
profile sampled at each frame and thus yields a
smooth trajectory in an equally easy manner. However,
the resulting trajectory is not natural, either. In either
case, little effort has been made to produce a natural
human trajectory.
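The implicit, force-profile form of input can be sketched with a simple per-frame Euler integration: each frame's force sample updates a velocity, which in turn updates a position, producing a smooth point stream. The function, mass, and damping constant below are assumed for illustration and are not part of the paper's method.

```python
def integrate_force_profile(forces, mass=1.0, damping=0.9, dt=1/30):
    """Illustrative sketch: convert a per-frame 2D force profile
    (e.g. from a joystick) into a smooth position stream by Euler
    integration with velocity damping. Constants are assumed values."""
    px = py = vx = vy = 0.0
    positions = []
    for fx, fy in forces:
        # damp the old velocity, then apply this frame's acceleration
        vx = damping * vx + (fx / mass) * dt
        vy = damping * vy + (fy / mass) * dt
        px += vx * dt
        py += vy * dt
        positions.append((px, py))
    return positions
```

The resulting trajectory is smooth by construction, which illustrates why smoothness alone does not make it natural: nothing in the integration introduces the oscillations or curvature variations characteristic of human steering.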
No matter what method we employ, it would be
difficult to generate high-quality locomotion with such
a poor trajectory. In this paper, we present a data-driven
method for refining an input trajectory for on-line, real-time
locomotion synthesis, given the type of locomotive
motion. Choosing the center of the pelvis as the root of an
articulated character, we describe how to yield a natural
pelvis trajectory from the input trajectory. Without loss
of generality, we assume that the input trajectory is
given in an explicit form, that is, in the form of a point
stream sampled at each frame. The refined trajectory
gives the global pelvis position and orientation at each
frame. The refinement is performed frame by frame in
an on-line manner. Our method performs a two-step