INFORMATION SCIENCES 65, 253-273 (1992)

Measures of Uncertainty and Information in Computation*

EDWARD W. PACKEL
Department of Mathematics, Lake Forest College, Lake Forest, Illinois 60045

J. F. TRAUB
Computer Science Department, Columbia University, New York, New York 10027

and

HENRYK WOŹNIAKOWSKI
Computer Science Department, Columbia University, New York, New York 10027, and Institute of Informatics, University of Warsaw, Warsaw, Poland

Communicated by Stephen S. Yau

ABSTRACT

Working within the framework of information-based complexity, a branch of theoretical computer science that studies the intrinsic difficulty of solving problems having incomplete information, we introduce a new concept for measuring information called the value of information. This number is defined in terms of the radius of information, a central concept of information-based complexity. The value of information is compared with the entropy-based concept of mutual information as defined in information theory. The two measures of information are shown to agree in certain cases and to differ in others. We focus on the average case setting of information-based complexity, noting that the radius of information, and hence the value of information, can be defined in a variety of other settings.

1. INTRODUCTION

The importance of quantifying the concept of information and of measuring its value is evidenced by the variety of scientific disciplines that have made significant contributions to this problem. Fascination with a notion of entropy to represent disorder, uncertainty, or loss of information has been a main

*This research was supported in part by the National Science Foundation under Grant M-89-07215.

© Elsevier Science Publishing Co., Inc. 1992
655 Avenue of the Americas, New York, NY 10010    0020-0255/92/$5.00