Sensitivity analysis of extreme inaccuracies in Gaussian Bayesian Networks

Miguel A. Gómez-Villegas and Paloma Maín
magv@mat.ucm.es, pmain@mat.ucm.es
Departamento de Estadística e Investigación Operativa
Universidad Complutense de Madrid
Plaza de Ciencias 3, 28040 Madrid, Spain

Rosario Susi
rsusi@estad.ucm.es
Departamento de Estadística e Investigación Operativa III
Universidad Complutense de Madrid
Avda. Puerta de Hierro s/n, 28040 Madrid, Spain

Abstract

We present the behavior of a sensitivity measure, defined to evaluate the impact of model inaccuracies on the posterior marginal density of the variable of interest after evidence propagation, under extreme perturbations of the parameters of a Gaussian Bayesian network. The sensitivity measure is based on the Kullback-Leibler divergence and yields different expressions depending on the type of parameter (mean, variance or covariance) being perturbed. This analysis is useful for assessing the extreme effect of uncertainty about some of the initial parameters of the model in a Gaussian Bayesian network. These concepts and methods are illustrated with some examples.

Keywords: Gaussian Bayesian network, Sensitivity analysis, Kullback-Leibler divergence.

1 Introduction

A Bayesian network is a graphical probabilistic model that provides a graphical framework for complex domains with many interrelated variables. Bayesian networks have been studied, among other authors, by Pearl (1988), Lauritzen (1996), Heckerman (1998) and Jensen (2001). A sensitivity analysis in a Bayesian network is necessary to study how sensitive the network's output is to inaccuracies or imprecisions in the parameters of the initial network, and therefore to evaluate the network's robustness. In recent years, several sensitivity analysis techniques for Bayesian networks have been developed.
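Since the sensitivity measure discussed below is built on the Kullback-Leibler divergence between Gaussian densities, a minimal sketch of the underlying quantity may help fix ideas. The snippet below uses the standard closed-form expression for the KL divergence between two univariate normal distributions; the function name and parametrization (by variances) are illustrative choices, not notation from the paper.

```python
import math

def kl_gaussian(mu1, var1, mu2, var2):
    """Closed-form KL divergence KL(N(mu1, var1) || N(mu2, var2))
    between two univariate normal densities."""
    return 0.5 * (math.log(var2 / var1)
                  + (var1 + (mu1 - mu2) ** 2) / var2
                  - 1.0)

# Identical densities have zero divergence.
print(kl_gaussian(0.0, 1.0, 0.0, 1.0))  # 0.0
# Perturbing the mean of the second density by 1 (unit variances) gives 0.5.
print(kl_gaussian(0.0, 1.0, 1.0, 1.0))  # 0.5
```

The divergence is zero only when the two densities coincide and grows as a perturbation moves a parameter away from its original value, which is what makes it a natural basis for a sensitivity measure.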
In discrete Bayesian networks, Laskey (1995) presents a sensitivity analysis based on computing the partial derivative of a posterior marginal probability with respect to a given parameter, Coupé et al. (2002) develop an efficient sensitivity analysis based on inference algorithms, and Chan et al. (2005) introduce a sensitivity analysis based on a distance measure. In Gaussian Bayesian networks, Castillo et al. (2003) present a sensitivity analysis based on symbolic propagation, and Gómez-Villegas et al. (2006) develop a sensitivity measure, based on the Kullback-Leibler divergence, to perform the sensitivity analysis.

In this paper, we study the behavior of the sensitivity measure presented by Gómez-Villegas et al. (2006) for extreme inaccuracies (perturbations) of the parameters that describe the Gaussian Bayesian network. To prove that this is a well-