arXiv:physics/0403031v1 [physics.data-an] 3 Mar 2004

Deriving Laws from Ordering Relations

Kevin H. Knuth
Computational Sci. Div., NASA Ames Research Ctr., M/S 269-3, Moffett Field CA 94035

Abstract. The effect of Richard T. Cox's contribution to probability theory was to generalize Boolean implication among logical statements to degrees of implication, which are manipulated using rules derived from consistency with Boolean algebra. These rules are known as the sum rule, the product rule and Bayes' Theorem, and the measure resulting from this generalization is probability. In this paper, I will describe how Cox's technique can be further generalized to include other algebras and hence other problems in science and mathematics. The result is a methodology that can be used to generalize an algebra to a calculus by relying on consistency with order theory to derive the laws of the calculus. My goals are to clear up the mysteries as to why the same basic structure found in probability theory appears in other contexts, to better understand the foundations of probability theory, and to extend these ideas to other areas by developing new mathematics and new physics. The relevance of this methodology will be demonstrated using examples from probability theory, number theory, geometry, information theory, and quantum mechanics.

INTRODUCTION

The algebra of logical statements is well-known and is called Boolean algebra [1, 2]. There are three operations in this algebra: conjunction (∧), disjunction (∨), and complementation (¬). In terms of the English language, the logical operation of conjunction is implemented by the grammatical conjunction 'and', the logical operation of disjunction is implemented by the grammatical conjunction 'or', and the logical complement is denoted by the adverb 'not'. Implication among assertions is defined so that a logical statement a implies a logical statement b, written a → b, when a ∨ b = b or equivalently when a ∧ b = a.
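The equivalence of these two definitions of implication can be checked exhaustively over truth values. The sketch below is illustrative and not from the paper; it verifies that a ∨ b = b and a ∧ b = a each coincide with classical material implication.

```python
from itertools import product

# Check, over all truth assignments, that the two order-theoretic
# definitions of "a implies b" agree with material implication.
for a, b in product([False, True], repeat=2):
    implies = (not a) or b              # classical material implication
    assert implies == ((a or b) == b)   # a OR b = b
    assert implies == ((a and b) == a)  # a AND b = a

print("both definitions agree with material implication")
```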
These are the basic ideas behind Boolean logic.

The effect of Richard T. Cox's contribution to probability theory [3, 4] was to generalize Boolean implication among logical statements to degrees of implication represented by real numbers. These real numbers, which represent the degree to which we believe one logical statement implies another logical statement, are now recognized to be equivalent to probabilities. Cox's methodology centered on deriving the rules to manipulate these numbers. The key idea is that these rules must maintain consistency with the underlying Boolean algebra. Cox showed that the product rule derives from associativity of the conjunction, and that the sum rule derives from the properties of the complement. Commutativity of the logical conjunction leads to the celebrated Bayes' Theorem. This set of rules for manipulating these real numbers is not one among many possible sets of rules; it is the unique generalization consistent with the properties of the Boolean algebraic structure.

Boole recognized that the algebra of logical statements was the same algebra that described sets [1]. The basic idea is that we can exchange 'set' for 'logical statement',
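The way commutativity of conjunction yields Bayes' Theorem can be seen numerically. In the sketch below (illustrative values, not from the paper), the product rule p(a∧b|c) = p(a|c)·p(b|a∧c) is written in both orders, and equating them gives p(a|b∧c) = p(a|c)·p(b|a∧c)/p(b|c):

```python
# Assumed illustrative probabilities, conditioned on some background c:
p_a = 0.3             # p(a|c)
p_b_given_a = 0.8     # p(b|a,c)
p_b_given_nota = 0.2  # p(b|not-a,c)

# Sum rule (marginalization) gives p(b|c):
p_b = p_a * p_b_given_a + (1 - p_a) * p_b_given_nota

# Product rule in one order: p(a,b|c) = p(a|c) p(b|a,c)
p_ab = p_a * p_b_given_a

# Bayes' Theorem, from commutativity p(a,b|c) = p(b,a|c):
p_a_given_b = p_ab / p_b

# Consistency check: product rule in the other order recovers p(a,b|c).
assert abs(p_ab - p_b * p_a_given_b) < 1e-12
print(round(p_a_given_b, 4))  # → 0.6316
```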