Protecting Privacy with Economics: Economic Incentives for Preventive Technologies in Ubiquitous Computing Environments

Alessandro Acquisti [*]
UC Berkeley

Workshop on Socially-informed Design of Privacy-enhancing Solutions in Ubiquitous Computing, Ubicomp 2002

Abstract

Ubiquitous computing environments make the economic analysis of privacy more difficult because they exacerbate information asymmetries and uncertainties. This paper discusses why the actual marketization of privacy is more difficult than its technical protection in these environments. It then focuses on the economic incentives that can justify the adoption of preventive privacy-enhancing technologies.

1 Privacy is Easy, Economics is Difficult?

Surveys have repeatedly identified privacy as one of the most pressing concerns of those using new information technology. [1] In Internet sales alone, billions of dollars are said to be lost every year because of privacy fears. [2] At the same time, academic research and industry efforts have developed protocols and technologies to protect individuals' privacy in almost any conceivable scenario, from browsing the Internet to purchasing on- and off-line. There is demand, and there is supply. So why does the market not clear?

This paper takes an economic approach to studying why privacy-enhancing technologies have failed to gain widespread adoption, even while the privacy and security of personal information have remained a concern for many. It is clear that economic incentives alone have failed to generate workable solutions: privacy, it seems, is more difficult to sell than to protect. Economic incentives may have such difficulty driving technology adoption because privacy itself is a difficult concept to define in economic terms.
One might delimit the privacy conundrum by describing it as the relation between a subject; some information related to that subject (and possibly a transaction that subject might be participating in); and a set of other parties (which might or might not interact directly with the subject) that have an interest in (or access to, use of, or some other relation with) that information. As already highlighted in the economic literature (see, e.g., Varian [1996]), the subject and the other parties are often in positions of information asymmetry with respect to the use that will be made of that information. For example, a customer might not know how a merchant will use the information she has just provided on his website. This creates problems for the economic analysis of privacy scenarios and for the design of appropriate economic incentives. The subject might not know if, when, and how often the information she provides will be used. In addition, she might not know what damage she will incur once that information becomes known, how much profit others will make thanks to that information, or what benefits she will forego if her privacy is violated. In extreme cases, the subject might not even be aware that she is revealing information. One might picture price discrimination as a privacy issue, in which economic agents might be revealing their "type," or preferences. "Myopic"

[*] Mail: SIMS, 102 South Hall, Berkeley, CA, 94720. Email: acquisti@sims.berkeley.edu.
[1] Privacy surveys are too numerous to be cited here. The Electronic Privacy Information Center (http://www.epic.org/) maintains archives of privacy-related news and links to privacy surveys.
[2] See, for example, Federal Trade Commission (2000).
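The connection between price discrimination and revealed "types" can be made concrete with a minimal numerical sketch. The figures below are hypothetical and not drawn from the paper: a merchant faces two buyer types with different willingness to pay; if buyers' types are revealed, the merchant can charge each her full valuation, while without that information he must post a single uniform price.

```python
# Illustrative sketch (all numbers are hypothetical): the profit a
# merchant gains from learning buyers' "types" is one way to quantify
# the economic value of the personal information buyers reveal.

valuations = {"high": 10.0, "low": 4.0}  # hypothetical willingness to pay

def uniform_profit(price):
    """Profit from one posted price: each buyer purchases iff
    her valuation is at least that price."""
    return sum(price for v in valuations.values() if v >= price)

# The best uniform price is always one of the valuations themselves.
best_uniform = max(uniform_profit(p) for p in valuations.values())

# With perfect knowledge of types, charge each buyer her valuation.
discriminating = sum(valuations.values())

print(best_uniform)    # 10.0: sell only to the high type at 10
print(discriminating)  # 14.0: the gap (4.0) is the value of the type information
```

The gap between the two profit figures is what the merchant would pay, at most, to learn buyers' types, which is precisely why the information the subject reveals has market value even when she herself cannot observe or price that value.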