MASSIVELY PARALLEL SIMULATIONS OF ASTROPHYSICAL PLASMAS:
STATUS AND PERSPECTIVES OF THE COAST PROJECT
B. Thooris, E. Audit, A. S. Brun, Y. Fidaali, F. Masset, D. Pomarède, R. Teyssier
Institut de Recherche sur les Lois Fondamentales de l’Univers
DSM/IRFU CEA/Saclay 91191 Gif-sur-Yvette France
email: Bruno.Thooris@cea.fr
http://irfu.cea.fr/Projets/COAST
KEYWORDS
Large scale computing, parallel computing, astrophysics,
plasma simulation, visualization.
ABSTRACT
The COAST (for Computational Astrophysics) project is
a program of massively parallel numerical simulations in
astrophysics involving astrophysicists and software
engineers from CEA/IRFU Saclay. The scientific
objective is the understanding of the formation of
structures in the Universe, including the study of large-
scale cosmological structures and galaxy formation,
turbulence in the interstellar medium, stellar
magnetohydrodynamics and protoplanetary systems. The
simulations of astrophysical plasmas are performed on
massively parallel mainframes (MareNostrum Barcelona,
CCRT CEA France), using 3-D magnetohydrodynamics
and N-body parallelized codes developed locally. We
present in this paper an overview of the software codes
and tools developed and some results of such simulations.
We also describe the Saclay SDvision graphical interface,
implemented in the framework of IDL Object graphics,
our 3-D visualization tool for analysis of the computation
results.
1. Introduction
The COAST project [1,2] is dedicated to high
performance computing in astrophysics. The goal is the
understanding of the formation of structures in the
Universe, by developing advanced techniques in parallel
computing and in applied mathematics to model the
formation of galaxies and predict their observational signatures, as a
function of physical parameters. Astrophysicists and
software engineers collaborate to rationalize and optimize
the development of simulation programs by creating a
core of common specific modules and using common
software tools for data handling, post-processing,
visualization, numerical methods, parallelization and
optimization.
2. Overview of the simulation programs
Four major numerical simulation programs are used to
cover different physics scales:
- The RAMSES code
RAMSES [3,4,5,6] is a hybrid, N-body and
hydrodynamical 3-D code which models the interplay of
the dark matter component and the baryon gas for
studying the structure and distribution of galaxy
clusters starting from the initial conditions of the Big Bang.
The code is based on the Adaptive Mesh Refinement
(AMR) technique, written in FORTRAN90 and
parallelized with the MPI library [7]. Current
developments focus on solving the full MHD set of
equations.
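The AMR idea can be illustrated with a minimal sketch (in Python for readability, although RAMSES itself is written in FORTRAN90): cells are recursively split where a local indicator exceeds a threshold. The refinement criterion, data layout and names below are illustrative assumptions, not the RAMSES implementation.

```python
# Hedged sketch of Adaptive Mesh Refinement: split cells where a local
# indicator (here, the relative density jump to the right-hand neighbour)
# exceeds a threshold.  Illustrative only, not the RAMSES data structures.

def refine(cells, threshold, max_level):
    """One AMR pass over a 1-D list of (x_left, width, level, density)."""
    out = []
    for i, (x, w, lvl, rho) in enumerate(cells):
        # crude refinement indicator: relative density jump to the right
        jump = 0.0
        if i + 1 < len(cells):
            rho_r = cells[i + 1][3]
            jump = abs(rho_r - rho) / max(rho, rho_r)
        if jump > threshold and lvl < max_level:
            # split the flagged cell into two children one level deeper
            out.append((x, w / 2, lvl + 1, rho))
            out.append((x + w / 2, w / 2, lvl + 1, rho))
        else:
            out.append((x, w, lvl, rho))
    return out

# uniform level-0 grid with a sharp density front at x = 0.5
grid = [(i / 8, 1 / 8, 0, 1.0 if i < 4 else 8.0) for i in range(8)]
refined = refine(grid, threshold=0.5, max_level=3)
```

Only the cell touching the front is flagged, so resolution is concentrated where the solution varies, which is the point of AMR in cosmological runs where collapsed structures occupy a small fraction of the volume.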
- The HERACLES code
HERACLES [8,9,10,11,12] is a 3-D code which solves
the equations of radiative transfer coupled to
hydrodynamics. It is used to study thermal condensation in
molecular clouds in the Interstellar Medium, radiative
shocks, molecular jets of young stars and proto-planetary
disks. It is written in FORTRAN90, parallelized with MPI
and implemented in cartesian, cylindrical and spherical
coordinates with regular mesh grids.
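The support for several geometries on a regular mesh can be illustrated by the corresponding volume elements; the following minimal Python sketch uses our own function names (not the HERACLES API) and the standard coordinate-system formulas.

```python
# Hedged sketch: the same regular radial index grid carries different
# cell volumes depending on geometry.  Standard volume-element formulas;
# function names are illustrative, not the HERACLES interface.
import math

def shell_volume(geometry, r0, r1):
    """Volume between coordinates r0 and r1 for one mesh zone."""
    if geometry == "cartesian":
        return r1 - r0                              # slab, unit cross-section
    if geometry == "cylindrical":
        return math.pi * (r1**2 - r0**2)            # full annulus, unit height
    if geometry == "spherical":
        return 4 * math.pi / 3 * (r1**3 - r0**3)    # full spherical shell
    raise ValueError(geometry)

# a regular radial mesh: per-zone volumes differ, totals are consistent
n, rmax = 10, 1.0
edges = [i * rmax / n for i in range(n + 1)]
vols = {g: [shell_volume(g, edges[i], edges[i + 1]) for i in range(n)]
        for g in ("cartesian", "cylindrical", "spherical")}
```

Summing the zone volumes recovers the unit slab, the unit-height cylinder and the full sphere respectively, a quick sanity check for any finite-volume scheme on such grids.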
- The ASH code
ASH [13,14] (for Anelastic Spherical Harmonic) performs 3-D
magnetohydrodynamics simulations in spherical geometry
for the study of the turbulence and magnetic dynamo
process in solar and stellar interiors. Unlike the other
codes presented here, which are developed entirely at
CEA/Saclay, ASH is developed jointly at Saclay and at the
University of Colorado at Boulder.
- The JUPITER code
JUPITER [15,16,17] is a multidimensional astrophysical hydrocode.
It is based on a Godunov method, written in C and
parallelized with MPI. The mesh geometry can either be
Proceedings of the 2008 High Performance
Computing & Simulation Conference ©ECMS
Waleed W. Smari (Ed.)
ISBN: 978-0-9553018-7-2 / ISBN: 978-0-9553018-6-5 (CD)
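The Godunov approach on which JUPITER is based can be sketched for the simplest hyperbolic problem, linear advection, where the interface Riemann problem is solved exactly by upwinding. This first-order Python sketch is illustrative only; JUPITER itself is written in C, and its grid and solver are our assumptions here.

```python
# Hedged sketch of a Godunov-type finite-volume update for linear
# advection u_t + a u_x = 0 on a periodic 1-D grid: the exact Riemann
# solution at each interface (for a > 0) is the upwind state.

def godunov_advect(u, a, dt, dx, steps):
    """First-order Godunov/upwind scheme, periodic boundaries, a > 0."""
    n = len(u)
    for _ in range(steps):
        # flux[i] is the flux through the left face of cell i:
        # the exact Riemann solution picks the upwind (left) state
        flux = [a * u[i - 1] for i in range(n)]
        u = [u[i] - dt / dx * (flux[(i + 1) % n] - flux[i])
             for i in range(n)]
    return u

# advect a top-hat profile over one full period at CFL = 0.5
n, a = 64, 1.0
dx = 1.0 / n
dt = 0.5 * dx / a
u0 = [1.0 if 16 <= i < 32 else 0.0 for i in range(n)]
u1 = godunov_advect(u0, a, dt, dx, steps=2 * n)   # t = 1, one period
```

The scheme is conservative (the total of u is preserved to round-off) and monotone (no new extrema), the two properties that make Godunov-type methods robust on the shocks that pervade astrophysical flows.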