Privacy Policies as Decision-Making Tools: An Evaluation of Online Privacy Notices

Carlos Jensen, Colin Potts
GVU Center, College of Computing
The Georgia Institute of Technology
Atlanta, GA 30332, USA
{carlosj, potts}@cc.gatech.edu
+1-404-894-5551

ABSTRACT
Studies have repeatedly shown that users are increasingly concerned about their privacy when they go online. In response to both public interest and regulatory pressures, privacy policies have become almost ubiquitous; an estimated 77% of websites now post a privacy policy. These policies differ greatly from site to site and often address issues different from those that users care about, yet in most cases they are the users' only source of information. This paper evaluates the usability of online privacy policies, as well as the practice of posting them. We analyze 64 current privacy policies: their accessibility, writing, content, and evolution over time. We examine how well these policies meet user needs and how they can be improved, and determine that significant changes need to be made to current practice to meet regulatory and usability requirements.

Author Keywords
Privacy, WWW, e-commerce, Usability, Consent, Readability

ACM Classification Keywords
H5.2 [Information Interfaces and Presentation]: User Interfaces – Evaluation, Usability; H5.4 [Information Interfaces and Presentation]: Hypertext/Hypermedia – User Issues

INTRODUCTION
Studies have repeatedly shown that users are increasingly concerned about their privacy when they go online. In a 2001 survey, 70% of respondents said they worried about their online privacy [9]. In a separate study, 69% said that they were "concerned about [online] privacy invasions and try to take action to prevent them from happening to [them]" [5]. This concern may not be unfounded: according to a recent study, 91% of U.S. Web sites collect personal information and 90% collect personally identifying information [1].
In response to public interest and regulatory pressures, privacy policies have become almost ubiquitous. The Progress and Freedom Foundation recently surveyed a sample of highly visited websites and found that 77% of those websites posted a privacy policy [1]. Website privacy policies are meant to inform consumers about business and privacy practices and to serve as a basis for consumer decision making. Not only are privacy policies important for decision making, they are often the only source of information. Policies therefore present an important challenge in terms of HCI: how to convey a large amount of complicated but critical information without overwhelming users.

We know there are several common problems with policies today, including a frequent mismatch between the issues companies wish to address in their policies and what users want to know about business practices. Part of the reason for this, and for why privacy policies differ so greatly from site to site, is a lack of regulation or industry standards, both in terms of the language used in the policies and the issues they address. This lack of standardization makes it difficult to compare and contrast policies, thereby decreasing their value to users. The situation is slowly changing as different industries have become more tightly regulated in terms of privacy (e.g., healthcare through the Health Insurance Portability and Accountability Act of 1996 (HIPAA) [15], finance through the Gramm-Leach-Bliley Act of 1999 (GLBA) [14], and children's sites through the Children's Online Privacy Protection Act of 1998 (COPPA) [13]).
Industry standards have also emerged in the form of privacy certification services, also known as "privacy seals." These are run either by independent companies or

CHI 2004, April 24–29, 2004, Vienna, Austria. Copyright 2004 ACM 1-58113-702-8/04/0004…$5.00.

Economics of Information Security

Privacy and Rationality in Individual Decision Making

Alessandro Acquisti, Carnegie Mellon University
Jens Grossklags, University of California, Berkeley

Traditional theory suggests consumers should be able to manage their privacy. Yet, empirical and theoretical research suggests that consumers often lack enough information to make privacy-sensitive decisions and, even with sufficient information, are likely to trade off long-term privacy for short-term benefits.

To appear in: IEEE Security & Privacy, January/February 2005, pp. 24-30.

From its early days1,2 to more recent incarnations, economic studies of privacy have viewed individuals as rational economic agents who go about deciding how to protect or divulge their personal information. According to that view, individuals are forward-looking utility maximizers and Bayesian updaters who are fully informed or base their decisions on probabilities coming from known random distributions. (Some recent works3,4 contrast myopic and fully rational consumers, but focus on the latter.)
This approach also permeates the policy debate, in which many believe not only that individuals and organizations should have the right to manage privacy trade-offs without regulatory intervention, but that individuals can, in fact, use that right in their own best interest. However, although several empirical studies have reported growing privacy concerns across the US population,5,6 recent surveys, anecdotal evidence, and experiments7–10 have highlighted an apparent dichotomy between privacy attitudes and actual behavior. First, individuals are willing to trade privacy for convenience, or to bargain the release of personal information in exchange for relatively small rewards. Second, individuals are seldom willing to adopt privacy-protective technologies. Our research combines theoretical and empirical approaches to investigate the drivers and apparent inconsistencies of privacy decision making and behavior. We present the theoretical groundings to critique

A Comparative Study of Online Privacy Policies and Formats

Aleecia M. McDonald,1 Robert W. Reeder,2 Patrick Gage Kelley,1 Lorrie Faith Cranor1
1 Carnegie Mellon, Pittsburgh, PA
2 Microsoft, Redmond, WA

AUTHORS' PRE-PRESS VERSION. Please cite to the published paper available from: http://www.springer.de/comp/lncs/index.html

Abstract. Online privacy policies are difficult to understand. Most privacy policies require a college reading level and an ability to decode legalistic, confusing, or jargon-laden phrases. Privacy researchers and industry groups have devised several standardized privacy policy formats to address these issues and help people compare policies. We evaluated three formats in this paper: layered policies, which present a short form with standardized components in addition to a full policy; the Privacy Finder privacy report, which standardizes the text descriptions of privacy practices in a brief bulleted format; and conventional non-standardized human-readable policies.
We contrasted six companies' policies, deliberately selected to span the range from unusually readable to challenging. Based on the results of our online study of 749 Internet users, we found participants were not able to reliably understand companies' privacy practices with any of the formats. Compared to natural language, participants were faster with standardized formats, but at the expense of accuracy for layered policies. Privacy Finder formats supported accuracy more than natural language for harder questions. Improved readability scores did not translate to improved performance. All formats and policies were similarly disliked. We discuss our findings as well as public policy implications.

Funded by NSF Cyber Trust grant CNS-0627513, Microsoft through the Carnegie Mellon Center for Computational Thinking, Army Research Office grant number DAAD19-02-1-0389 to Carnegie Mellon CyLab, and FCT through the CMU/Portugal Information and Communication Technologies Institute. Thanks to Robert McGuire and Keisha How for programming assistance.
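Claims like "require a college reading level" and "improved readability scores" typically rest on standard readability formulas such as Flesch Reading Ease, which combine average sentence length with average syllables per word. As a minimal illustrative sketch (not the instrument the authors used), the formula can be computed with a crude vowel-group syllable heuristic:

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count vowel groups; drop one for a silent trailing 'e'.
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text: str) -> float:
    # Higher scores mean easier text; ~30 and below is "college level".
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    asl = len(words) / max(len(sentences), 1)  # avg words per sentence
    asw = syllables / max(len(words), 1)       # avg syllables per word
    return 206.835 - 1.015 * asl - 84.6 * asw
```

A short plain sentence scores far higher than legalistic policy prose, which is exactly the contrast such formulas capture; note, though, that the syllable heuristic is approximate, which is one reason formula scores can diverge from measured user comprehension, as the study above found.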