Stochastics and Statistics

Merging experts' opinions: A Bayesian hierarchical model with mixture of prior distributions

M.J. Rufo*, C.J. Pérez, J. Martín

Departamento de Matemáticas, Universidad de Extremadura, Avda. de la Universidad s/n, 10071 Cáceres, Spain

Article history: Received 15 December 2008; Accepted 7 April 2010; Available online 13 April 2010

Keywords: Bayesian analysis; Conjugate prior distributions; Exponential families; Prior mixtures; Kullback–Leibler divergence

Abstract

In this paper, a general approach is proposed to address a full Bayesian analysis for the class of quadratic natural exponential families in the presence of several expert sources of prior information. By expressing the opinion of each expert as a conjugate prior distribution, a mixture model is used by the decision maker to arrive at a consensus of the sources. A hyperprior distribution on the mixing parameters is considered, and a procedure based on the expected Kullback–Leibler divergence is proposed to analytically calculate the hyperparameter values. Next, the experts' prior beliefs are calibrated with respect to the combined posterior belief over the quantity of interest by using expected Kullback–Leibler divergences, which are estimated with a computationally low-cost method. Finally, the proposed approach can be easily applied in practice, as shown with an application.

© 2010 Elsevier B.V. All rights reserved.

1. Introduction

The choice of suitable prior distributions is not a simple task when Bayesian methods are applied, particularly when issues related to the analysis of experts' opinions and decision making are dealt with (see Korhonen et al. (1992) for a review of multiple criteria decision making problems). Often, the prior distribution is chosen to approximately reflect the expert's initial opinion. In this context, a common choice is a conjugate prior distribution.
However, in some situations, a single conjugate prior distribution may be inadequate to accurately reflect the available prior knowledge. Dalal and Hall (1983) and Diaconis and Ylvisaker (1985) showed that it is possible to extend these distributions through the use of mixtures of conjugate prior distributions (see also Lijoi (2003) for a more recent study). The main advantage is that mixtures of conjugate prior distributions can be sufficiently flexible (allowing, for example, multimodality), while keeping posterior calculations simple (since they are also conjugate families). Some interesting applications of prior mixtures can be found in Savchuk and Martz (1994), Liechty et al. (2004), and Atwood and Youngblood (2005).

This paper provides a general framework that allows one to perform a full Bayesian analysis for natural exponential families with quadratic variance function (NEF-QVF) by using mixtures of conjugate prior distributions with unknown weights. These families have been considered because they contain distributions very commonly used in real applications, such as the Poisson, binomial, negative binomial, normal, and gamma.

Throughout the paper, it is assumed that a decision maker consults several sources about a quantity of interest. Therefore, the prior information is considered to come from several sources such as experts. The opinion of each expert is elicited as a conjugate distribution over the quantity of interest (see, e.g., Szwed et al. (2006) for a particular case of prior distribution specification). Then, the decision maker combines the experts' distributions by using a mixture model in order to represent a consensus of the experts. Chen and Pennock (2005) observed that the selection of the weights is a drawback of this approach. Sometimes, the weights are fixed in advance. Here, the weights are treated as parameters and a suitable hyperprior distribution is specified.
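To illustrate why a mixture of conjugate priors keeps posterior calculations simple, consider the binomial case with a mixture-of-Beta prior: each Beta component updates by the usual conjugate rule, and its mixing weight is rescaled by that component's marginal likelihood. The sketch below is not from the paper; the function name, the two-expert setup, and the specific Beta parameters are purely illustrative.

```python
import math

def beta_mixture_posterior(weights, params, s, n):
    """Update a mixture-of-Beta prior after observing s successes in n
    Bernoulli trials.  Each component Beta(a, b) stays conjugate: its
    parameters become (a + s, b + n - s), and its mixing weight is
    rescaled by the component's marginal likelihood
    B(a + s, b + n - s) / B(a, b) (the binomial coefficient cancels)."""
    def log_beta(a, b):
        # log of the Beta function via log-gamma, for numerical stability
        return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

    new_params, log_w = [], []
    for w, (a, b) in zip(weights, params):
        new_params.append((a + s, b + n - s))
        log_w.append(math.log(w) + log_beta(a + s, b + n - s) - log_beta(a, b))

    # normalize the updated weights in log space (log-sum-exp trick)
    m = max(log_w)
    unnorm = [math.exp(lw - m) for lw in log_w]
    total = sum(unnorm)
    return [u / total for u in unnorm], new_params

# Two hypothetical experts with conflicting Beta opinions about a success
# probability; the observed data (9 successes in 10 trials) favour expert 2,
# so the posterior shifts almost all mixing weight onto that component.
w_post, p_post = beta_mixture_posterior([0.5, 0.5], [(2, 8), (8, 2)], s=9, n=10)
```

The posterior is again a two-component Beta mixture, which is exactly the conjugacy property exploited throughout the paper.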
This leads to greater freedom and flexibility in the modeling of the initial information. In order to obtain the hyperparameter values, a general procedure based on expected Kullback–Leibler divergences is proposed. An advantage is that the procedure is analytical: general expressions that allow a direct implementation for all distributions in these families are obtained. Nevertheless, other hyperparameter values can be chosen by the reader and used in the subsequent Bayesian analysis.

Finally, the expected discrepancies between the combined posterior belief over the quantity of interest and each expert's prior belief are analyzed by using the expected Kullback–Leibler divergence between the mixture of the posterior distributions for this quantity and the prior distribution of each expert. A Monte Carlo-based approach is considered to estimate these values.

doi:10.1016/j.ejor.2010.04.005

* Corresponding author. Address: Departamento de Matemáticas, Escuela Politécnica, Universidad de Extremadura, Avda. de la Universidad s/n, 10071 Cáceres, Spain. Tel.: +34 927257220; fax: +34 927257203. E-mail addresses: mrufo@unex.es (M.J. Rufo), carper@unex.es (C.J. Pérez), jrmartin@unex.es (J. Martín).

European Journal of Operational Research 207 (2010) 284–289
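The Monte Carlo step mentioned above can be sketched for the Beta case: draw samples from the posterior mixture and average the log-ratio of the mixture density to the expert's prior density, which is the standard Monte Carlo estimator of a Kullback–Leibler divergence. This is only a minimal sketch under illustrative assumptions (Beta components, hypothetical function name and parameter values); the paper's own procedure covers all NEF-QVF families.

```python
import math
import random

def kl_mixture_vs_prior(post_weights, post_params, prior_param,
                        n_samples=50000, seed=0):
    """Monte Carlo estimate of KL(posterior mixture || expert prior) for
    Beta components: sample theta from the posterior mixture and average
    log p_mix(theta) - log p_expert(theta)."""
    rng = random.Random(seed)

    def log_beta_pdf(x, a, b):
        log_b = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
        return (a - 1) * math.log(x) + (b - 1) * math.log(1 - x) - log_b

    total = 0.0
    for _ in range(n_samples):
        # sample a mixture component, then theta from that component
        k = rng.choices(range(len(post_weights)), weights=post_weights)[0]
        a, b = post_params[k]
        theta = rng.betavariate(a, b)
        log_mix = math.log(sum(w * math.exp(log_beta_pdf(theta, *p))
                               for w, p in zip(post_weights, post_params)))
        total += log_mix - log_beta_pdf(theta, *prior_param)
    return total / n_samples

# A posterior mixture concentrated near 0.85 versus an expert prior
# concentrated near 0.2: the estimated divergence is large, flagging
# an expert whose prior belief is far from the combined posterior.
kl = kl_mixture_vs_prior([0.1, 0.9], [(11, 9), (17, 3)], (2, 8))
```

Larger estimated divergences identify experts whose prior beliefs are poorly calibrated with respect to the combined posterior, which is how the calibration analysis in the paper is interpreted.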