Bayes’ Rule, Principle of Indifference, and Safe Distribution

Andrzej Piegat 1 and Marek Landowski 1,2

1 Faculty of Computer Science and Information Systems, Szczecin University of Technology, Zolnierska 49, 71-210 Szczecin, Poland
Andrzej.Piegat@wi.ps.pl, mlandowski@wi.ps.pl
2 Quantitative Methods Institute, Szczecin Maritime University, Waly Chrobrego 1-2, 70-500 Szczecin, Poland
m.landowski@am.szczecin.pl

Abstract. Bayes’ rule is the basis of probabilistic reasoning. It makes it possible to surmount information gaps. However, it requires knowledge of the prior distributions of the probabilistic variables. If such a distribution is not known, then, according to the principle of indifference, the uniform distribution has to be assumed. The uniform distribution is frequently and heavily criticized. This paper presents a safe distribution of probability density that can often be used instead of the uniform distribution to surmount information gaps. To the authors’ knowledge, the concept of the safe distribution is new and has not appeared in the literature.

Key words: Bayes’ rule, principle of indifference, prior probability distribution, Bayesian networks, automated reasoning.

1 Introduction

Human intelligence comprises many skills. The most basic of them is the skill of reasoning, especially under uncertainty. One type of automated reasoning is probabilistic reasoning, whose aim can be briefly defined as ”to build network models to reason under uncertainty according to the laws of probability theory” [3]. Such network models are called Bayesian networks, belief networks, or probabilistic networks. Decision networks, an extension of Bayesian networks, make it possible to compute decisions under uncertainty. Solving problems under uncertainty (a partial lack of knowledge) is one of the most difficult aims of artificial intelligence (AI). People can solve such problems.
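The interplay described in the abstract, a uniform prior assumed under the principle of indifference and then revised by Bayes’ rule, can be sketched as follows. This is a minimal illustration with hypothetical hypothesis names and likelihood values, not an example taken from the paper:

```python
# Bayes'-rule update of a discrete prior: p(H|E) = p(E|H) p(H) / p(E).
# The uniform prior below is what the principle of indifference prescribes
# when nothing is known about the hypotheses; the likelihood values are
# assumed for illustration only.

def bayes_update(prior, likelihood):
    """Return the posterior p(H|E) over the same hypothesis set."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    evidence = sum(unnormalized.values())  # p(E), by total probability
    return {h: p / evidence for h, p in unnormalized.items()}

prior = {"H1": 1/3, "H2": 1/3, "H3": 1/3}        # uniform (indifference)
likelihood = {"H1": 0.9, "H2": 0.5, "H3": 0.1}   # p(E|H), assumed values
posterior = bayes_update(prior, likelihood)
print(posterior)  # H1 -> 0.6, H2 -> 1/3, H3 -> 1/15
```

The posterior is no longer uniform: the evidence shifts mass toward the hypothesis under which it was most likely, which is exactly the kind of revision the criticized uniform prior is meant to start from.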
To make AI comparable with human intelligence, it must also be able to solve problems under uncertainty. Problems of information gaps are currently being intensively investigated [4]. An information gap means a lack of knowledge of variable values, of their probability or possibility distributions, of their variability intervals, etc. One way to surmount information gaps is proposed by probability theory in the form of Bayes’ theorem [3], [5]. Let A and B denote two events. The conditional probability p(A|B) is not known. However,