APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY
Appl. Stochastic Models Bus. Ind. 2010; 26:448–472
Published online 15 September 2009 in Wiley Online Library (wileyonlinelibrary.com). DOI: 10.1002/asmb.803

Divergences without probability vectors and their applications

Athanasios Sachlas and Takis Papaioannou*
Department of Statistics and Insurance Science, University of Piraeus, 185 34 Piraeus, Greece

SUMMARY

In general, divergences and measures of information are defined for probability vectors. In some cases, however, divergences are 'informally' used to measure the discrepancy between vectors that are not necessarily probability vectors. In this paper we examine whether divergences with nonprobability vectors as arguments share the properties of probabilistic or information-theoretic divergences. The results indicate that, under some conditions, divergences with nonprobability vectors do share some of these properties and can therefore be regarded and used as information measures. We then apply these divergences to the problem of actuarial graduation of mortality rates. Copyright 2009 John Wiley & Sons, Ltd.

Received 8 February 2008; Revised 24 July 2009; Accepted 25 July 2009

KEY WORDS: Kullback–Leibler divergence; Cressie–Read divergence; divergence with nonprobability vectors; graduation of mortality rates

1. INTRODUCTION

There are many practical problems in which nonprobability vectors arise. One such problem is the actuarial graduation of mortality rates. Although divergences and/or measures of information are defined for probability vectors, in practice they are used with nonprobability vectors as well. The main purpose of this paper is to explore the properties of divergences without probability vectors and to provide an application in the actuarial field.
A bivariate function D(f, g) of two functions or vectors f and g is a measure of divergence if D(f, g) ≥ 0, with equality if and only if f = g (see [1]). This is the minimal requirement for a measure D(f, g) to be a 'kind' of distance between f and g. In [2, p. 2] it is mentioned that a coefficient with the property of increasing as the two distributions involved move 'further from each other' is also desirable.

*Correspondence to: Takis Papaioannou, Department of Statistics and Insurance Science, University of Piraeus, 185 34 Piraeus, Greece. E-mail: takpap@unipi.gr
Part of the work of the second author was done while visiting the Department of Mathematics and Statistics of the University of Cyprus.
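To make the minimal requirement concrete, the following sketch computes a generalized Kullback–Leibler (I-)divergence for nonnegative vectors that need not sum to one. This is a standard extension (the correction terms −Σf_i + Σg_i restore nonnegativity when the arguments are not probability vectors); the function name and exact form are illustrative assumptions, not the paper's own definitions.

```python
import math

def generalized_kl(f, g):
    """Generalized KL (I-)divergence for nonnegative vectors f, g.

    D(f, g) = sum_i [ f_i * log(f_i / g_i) - f_i + g_i ] >= 0,
    with equality iff f == g.  For probability vectors the correction
    terms cancel and this reduces to the ordinary KL divergence.
    """
    if len(f) != len(g):
        raise ValueError("vectors must have equal length")
    total = 0.0
    for fi, gi in zip(f, g):
        if fi < 0 or gi < 0:
            raise ValueError("entries must be nonnegative")
        if fi > 0:
            total += fi * math.log(fi / gi)  # contributes 0 when fi == 0
        total += gi - fi  # correction for non-normalized arguments
    return total
```

For example, `generalized_kl([2.0, 1.0], [1.0, 2.0])` is strictly positive, while identical arguments give zero, in line with the divergence property D(f, g) ≥ 0 with equality iff f = g.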