Proceedings of the 2017 Winter Simulation Conference
W. K. V. Chan, A. D'Ambrogio, G. Zacharewicz, N. Mustafee, G. Wainer, and E. Page, eds.

DEEP GAUSSIAN PROCESS METAMODELING OF SEQUENTIALLY SAMPLED NON-STATIONARY RESPONSE SURFACES

Vincent Dutordoir
Nicolas Knudde
Joachim van der Herten
Ivo Couckuyt
Tom Dhaene

Ghent University - imec
IDLab
iGent Tower - Department of Information Technology
Technologiepark-Zwijnaarde 15
B-9052 Ghent, BELGIUM

ABSTRACT

Simulations are often used in the design of complex systems, as they allow the design space to be explored without the need to build several prototypes. Over the years, simulation accuracy, as well as the associated computational cost, has increased significantly, limiting the overall number of simulations during the design process. Metamodeling therefore aims to approximate the simulation response with a cheap-to-evaluate mathematical approximation, learned from a limited set of simulator evaluations. Kernel-based methods using stationary kernels are nowadays widely used. However, using stationary kernels for non-stationary responses can be inappropriate and can result in poor models when combined with sequential design. We present the application of a novel kernel-based technique, known as Deep Gaussian Processes, which is better able to cope with these difficulties. We evaluate the method for non-stationary regression on a series of real-world problems, showing that it outperforms standard Gaussian Processes with stationary kernels.

1 INTRODUCTION

During the design and analysis of complex systems, computer simulations are often used to limit the number of expensive prototypes or real-life measurements. While adding more flexibility to study phenomena under controlled conditions, high-accuracy computer simulations often require a substantial investment of computational resources and time. A single simulation may easily take several hours or even days to complete.
This is especially problematic for routine tasks such as optimization, sensitivity analysis, and design space exploration. As a result, over the past decade, a lot of research has focused on alleviating these computational issues. An established technique is the construction of a mathematical expression which approximates the behavior of the simulation as closely as possible, while being computationally cheap(er) to evaluate. If the approximation is based on a (limited) set of simulator evaluations, it is known as a metamodel (Kleijnen 2015, Gorissen et al. 2010). The metamodel may then be used instead of the computationally expensive simulator to derive characteristics of the problem at hand or to search for optima.

More formally, we sample the expensive simulation $N$ times at different locations $x_i$ in order to obtain the simulation responses $f_i = f(x_i)$, which we collect into a (training) dataset $\{x_i, f_i\}_{i=1}^N$. Here we assume the simulation is deterministic and noise-free (with the exception of quantization errors due to the finite representation of real numbers). We subsequently use this dataset to construct a metamodel aiming
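The sample-then-fit workflow described above can be sketched in a few lines of code. The snippet below is a minimal, illustrative example only: the toy `simulator` function stands in for an expensive deterministic simulation, and the metamodel is a plain noise-free Gaussian Process with a stationary squared-exponential (RBF) kernel, i.e., the baseline approach the paper argues can struggle on non-stationary responses; it is not the Deep Gaussian Process method itself.

```python
import numpy as np

def simulator(x):
    # Hypothetical cheap stand-in for an expensive deterministic simulation.
    return np.sin(3.0 * x) + 0.5 * x

def rbf_kernel(a, b, lengthscale=0.3, variance=1.0):
    # Stationary squared-exponential kernel: depends only on the distance a - b.
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Step 1: sample the simulator N times at locations x_i to build {x_i, f_i}.
N = 8
x_train = np.linspace(0.0, 2.0, N)
f_train = simulator(x_train)

# Step 2: condition the GP on the data. The simulation is noise-free,
# so only a tiny jitter is added for numerical stability of the Cholesky.
K = rbf_kernel(x_train, x_train) + 1e-10 * np.eye(N)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, f_train))

# Step 3: the posterior mean is a cheap-to-evaluate metamodel of f.
x_test = np.linspace(0.0, 2.0, 50)
mean = rbf_kernel(x_test, x_train) @ alpha
```

Because the simulator is treated as noise-free, the resulting metamodel interpolates the training data: evaluating the posterior mean at the sampled locations `x_train` reproduces `f_train` up to the jitter term.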