24.1. Introduction

Changes in consumer attitudes are occurring against the backdrop of two major trends: (1) the evolution of the concept of privacy; and (2) the erosion of historical protections for privacy.

The conception of privacy and, correspondingly, that of informed consent, has evolved through changes in political, legal, economic, social, and technological spheres. In earlier times, privacy (and the security it afforded) existed primarily in relation to physical property. Consequently, protections were given against trespasses of physical property and against battery to a person's body. Liberty meant "freedom from actual restraint."[6] In the late 1800s, advances in photographic technology made it possible to take pictures surreptitiously. Thus, the implicit consent given when "sitting" for a portrait became an inadequate safeguard against the improper capturing of one's portrait for circulation and profit. In reaction to these changes, in a landmark 1890 Harvard Law Review article, "The Right to Privacy,"[7] Samuel Warren and Louis Brandeis urged the courts to recognize an individual's "right to be let alone," to be free from unwarranted intrusions into personal affairs. While Warren and Brandeis were referring primarily to publishable records (e.g., photography and popular media articles), inherent to their plea was the change in meaning of "the right to life," from merely protecting physical property to "the right to enjoy life" spiritually, emotionally, and intellectually. Important for our purposes here, the article established a clear relationship between the "right to privacy" and informed consent: that "the right to privacy ceases upon the publication of the facts by the individual, or upon his consent." In addition, having the ability to consent (to prevent or allow publication) afforded peace of mind, relief, and freedom from fear of injury.[8]

[6] Samuel D. Warren and Louis D. Brandeis, "The Right to Privacy," Harvard Law Review IV:5 (1890).

[7] Ibid.

[8] Ibid.

Paralleling this evolution in conceptions of privacy, protections for privacy have eroded, largely as a result of technological advances. For example, in earlier times, the mere format in which personal information was collected (paper) afforded reasonable privacy protection. Inconvenient access to information, limited reproduction capabilities, and the inability to easily link, sort, and process information acted as "natural" shields protecting personal data. Nowadays, networking and digitization have stripped the public of that "natural shielding," continually demanding a reconceptualization of privacy and its protection. How is this relevant for today's businesses and for technology design?

Accounting for privacy in technology design does not merely address a moral concern; it also safeguards the design host company from financial risks and public relations backlashes. For example, the built-in unique processor identifier of Intel's Pentium III processor, also known as the Processor Serial Number (PSN) technology, was introduced to help corporations manage inventories.[9] Even though the PSN is a number attached to a processor, not a person, and even though Intel took precautionary actions by creating a control utility that allowed users to turn off the identifier, the introduction of security at the potential expense of privacy generated consumer distrust. Intel faced consumer boycotts and legal action from privacy advocacy groups. According to a 1999 report on CNET News, "The serial number has sparked a definite negative emotional chord with segments of the population."[10] Other technologies, such as commonplace workstations that contain microphones without hardware on/off switches[11] and the more exotic Active Badge Location system of the Palo Alto Research Center (PARC),[12] are examples of designs that similarly compromised privacy for the sake of other functionalities and that were met with resistance from some groups.

[9] Intel Corporation, "Intel Pentium III Processor" (2003) [cited Dec. 10, 2004]; http://support.intel.com.

[10] Stephanie Miles, "How Serious Is Pentium III's Privacy Risk?" (1999) [cited Dec. 1, 2004]; http://news.com.com/How+serious+is+Pentium+IIIs+privacy+risk/2100-1040_3-222905.html.

[11] John C. Tang, "Eliminating a Hardware Switch: Weighing Economics and Values in a Design Decision," in Batya Friedman (ed.), Human Values and the Design of Computer Technology (Cambridge: CSLI Publications Center for the Study of Language and Information, 1997), 259-269.

[12] Roy Want, Andy Hopper, Veronica Falcao, and Jon Gibbons, "The Active Badge Location System," ACM Transactions on Information Systems 10 (1992), 91-102.

In this chapter, we first present a conceptual model of informed consent for information systems. Next we present three cases in which the conceptual model has been applied: cookie handling in web browsers; secure connections for web-based interactions; and Google's Gmail web-based email service. Each case discussed here involves a widely deployed technology in use at the time our investigations were conducted; each invokes privacy or security concerns for the public at large; and each highlights a unique set of challenges and design possibilities for informed consent. In reporting on these cases, our goal is not only to articulate how these three specific systems might be improved, but also more generally to illustrate how the model can be used proactively to design information systems that support the user experience of informed consent. Finally, we propose design principles and business practices to enhance privacy and security through informed consent.

VALUE SENSITIVE DESIGN

Value Sensitive Design[a] emerged in the 1990s as an approach to the design of information and computer systems that accounts for human values in a principled and comprehensive manner throughout the design process. While emphasizing the moral perspective (e.g., privacy, security, trust, human dignity, physical and psychological well-being, informed consent, intellectual property), Value Sensitive Design also accounts for usability (e.g., ease of use), conventions (e.g., standardization of technical protocols), and personal predilections (e.g., color preferences within a graphical interface).

Key features of Value Sensitive Design involve its interactional perspective, tripartite methodology, and emphasis on direct and indirect stakeholders:


Interactional theory

Value Sensitive Design is an interactional theory: values are viewed neither as inscribed into technology nor as simply transmitted by social forces. Rather, people and social systems affect technological development, and new technologies shape (but do not rigidly determine) individual behavior and social systems.


Tripartite methodology: conceptual, empirical, and technical

Value Sensitive Design systematically integrates and iterates on three types of investigations. Conceptual investigations comprise philosophically informed analyses of the central constructs and issues under investigation. For example: what values have standing? How should we engage in tradeoffs among competing values (e.g., access versus privacy, or security versus trust)? Empirical investigations focus on the human response to the technical artifact and on the larger social context in which the technology is situated. The entire range of quantitative and qualitative social science research methods may be applicable (e.g., observations, interviews, surveys, focus groups, measurements of user behavior and human physiology, contextual inquiry, and interaction logs). Technical investigations focus on the design and performance of the technology itself, involving both retrospective analyses of existing technologies and the design of new technical mechanisms and systems. The conceptual, empirical, and technical investigations are employed iteratively such that the results of one type are integrated with those of the others, which, in turn, influence yet additional investigations of the earlier types.


Direct and indirect stakeholders

Direct stakeholders are parties who interact directly with the computer system or its output. Indirect stakeholders are all other parties who are otherwise affected by the use of the system. For example, online court record systems impact not only the direct stakeholders, such as lawyers, judges, and journalists who access the court records, but also an especially important group of indirect stakeholders: the people documented in the court records.


[a] Adapted from Batya Friedman, "Value Sensitive Design," in William Sims Bainbridge (ed.), Berkshire Encyclopedia of Human-Computer Interaction (Great Barrington, MA: Berkshire Publishing Group, 2004), 769-777.


