CEIG’09, San Sebastián, Sept. 9-11 (2009)
C. Andújar and J. Lluch (Editors)
© The Eurographics Association 2009.
A Framework for Rendering, Simulation and
Animation of Crowds
N. Pelechano, B. Spanlang, A. Beacco
Universitat Politècnica de Catalunya, Barcelona, Spain
Abstract
Real-time crowd simulation for virtual environment applications requires not only navigation and locomotion in large environments while avoiding obstacles and other agents, but also rendering high-quality 3D fully articulated figures to enhance realism. In this paper, we present a framework for real-time simulation of crowds. The framework is composed of a Hardware Accelerated Character Animation Library (HALCA), a crowd simulation system that can handle large crowds with high densities (HiDAC), and an Animation Planning Mediator (APM) that bridges the gap between the global position of the agents given by HiDAC and the correct skeletal state, so that each agent is rendered with natural locomotion in real time.
The main goal of this framework is to allow high-quality visualization and animation of several hundred realistic-looking characters (about 5000 polygons each) navigating virtual environments on a single-display PC, an HMD (Head Mounted Display), or a CAVE system. Results of several applications on a number of platforms are presented.
Categories and Subject Descriptors (according to ACM CCS): I.3.7 [Computer Graphics]: Three-Dimensional
Graphics and Realism – Animation; I.6.8 [Simulation and Modeling]: Types of Simulation – Animation.
1. Introduction
Natural-looking crowd simulation is necessary for applications such as video games and training simulators. Natural-looking animations become even more important in immersive applications, where a real participant can interact in real time with characters in a virtual environment. If our goal is to interact with a relatively large number of agents, those agents must perform natural path planning and animation, ensuring consistency between what we observe in the virtual world and what our experience of the real world leads us to expect.
In order to achieve visually appealing results it is necessary to integrate motion planners with motion synthesizers. Most crowd simulation systems focus on the path planning and local motion of the agents, while using a character animation library to render 3D fully articulated characters playing a small set of animation clips. The problem that is most readily observed is that the characters have a very limited repertoire of movements available and that constraints such as ground contact points are not respected.
Generating empathetic characters with realistic movements and responses to other virtual agents and obstacles in the environment can be very tedious work, since such characters usually require a large number of animation clips to exhibit a wide variety of movements. Being able to synthesize motions from small sets of animation clips is thus of great relevance, especially if we want to exhibit variety while following the requirements given by a crowd simulation module that deals with path planning and local motion.
In this paper we present a framework for real-time crowd simulation which integrates a crowd simulation module that determines the position, velocity and orientation of each agent at every frame with a motion synthesizer that guarantees smooth and natural-looking animation for every agent.
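To make the coupling concrete, the following is a minimal sketch in Python of a mediator that maps the simulator's per-frame root motion (position, velocity, orientation) onto a skeletal animation request. All names here (AgentState, AnimationMediator, the clip set and its nominal speeds) are hypothetical illustrations of the idea, not the actual APM interface:

```python
import math
from dataclasses import dataclass


@dataclass
class AgentState:
    """Per-frame output of the crowd simulator for one agent:
    root position, root velocity, and orientation (yaw, radians)."""
    position: tuple
    velocity: tuple
    yaw: float


class AnimationMediator:
    """Hypothetical mediator: selects an animation clip and a playback
    rate so that the clip's foot speed matches the simulated root speed,
    which is one way to avoid foot sliding over the ground."""

    # Assumed nominal root speeds (m/s) at which each clip was authored.
    CLIP_SPEEDS = {"idle": 0.0, "walk": 1.4, "run": 3.0}

    def plan(self, state: AgentState):
        speed = math.hypot(*state.velocity)
        if speed < 0.1:
            return ("idle", 1.0)
        # Pick the locomotion clip whose nominal speed is closest
        # to the agent's current root speed.
        clip = min(("walk", "run"),
                   key=lambda c: abs(self.CLIP_SPEEDS[c] - speed))
        # Scale playback rate so stride speed matches root speed.
        rate = speed / self.CLIP_SPEEDS[clip]
        return (clip, rate)
```

For example, an agent moving at 1.4 m/s would be assigned the walk clip at its nominal playback rate, while an agent at 2.0 m/s would play the same clip slightly faster. A full mediator would additionally blend between clips and enforce contact constraints, which this sketch omits.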
In order to integrate both modules we developed an
Animation Planning Mediator which with the information