21.2. Faces: (Mis)Managing Ubicomp Privacy

Our investigation into the pitfalls began after we encountered them firsthand while designing Faces, a software prototype for specifying privacy preferences in ubicomp environments.

21.2.1. Faces Design

Ubicomp envisions computation embedded throughout everyday environments to support arbitrary human activities,[3] but the distribution and concealment of displays and sensors can complicate interaction.[4] This can disadvantage users by leaving them unaware of or unable to influence the disclosure of personal information, such as location and identity, as they go about their activities in augmented environments. To address this, we designed Faces to do the following:

[3] Mark Weiser, "The Computer for the Twenty-First Century," Scientific American 265:3 (1991), 94-104.

[4] Victoria Bellotti, Maribeth Back, W. Keith Edwards, Rebecca E. Grinter, Austin Henderson, and Cristina Lopes, "Making Sense of Sensing Systems: Five Questions for Designers and Researchers." Conference on Human Factors in Computing Systems (CHI 2002; Minneapolis, 2002).

  • Support the specification of disclosure preferences, such as who can obtain what information when (see Figure 21-1).

  • Provide feedback about past disclosures in an accessible log, not unlike the financial transaction logs in Quicken and Microsoft Money (see Figure 21-2). Users would employ the feedback in the log to iteratively refine their disclosure preferences over time.

NOTE

Some might object to the Faces disclosure log by claiming that informing the user about a disagreeable disclosure after the fact is too late to be useful. Whitten and Tygar, for example, claim that a so-called "barn door property" governs privacy disclosures: "once a secret has been left accidentally unprotected, even for a short time, there is no way to be sure that it has not already been read by an attacker."[5]

[5] Alma Whitten and J. D. Tygar, "Why Johnny Can't Encrypt: A Usability Evaluation of PGP 5.0." 8th USENIX Security Symposium (Washington, D.C., 1999). See also Chapter 34, this volume.

While this may apply to highly sensitive disclosures, a significant component of privacy maintenance is the regulation of mundane disclosures over time to influence an observer's historical, evolving impressions of oneself. People are remarkably capable of finessing the consequences of the occasional (and inevitable) disagreeable disclosure, and they learn to minimize repeat occurrences. The Faces disclosure log was intended to help users transfer such iterative behavior refinement to the domain of the sensed environment.
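To make the log concept concrete, the following sketch in Python shows the kind of record such a log might hold and how a user (or the interface on her behalf) might sift it for disclosures worth revisiting. The field names, face names, and example values are hypothetical; they are not taken from the Faces implementation.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DisclosureLogEntry:
    """One row in a Quicken-style disclosure log (field names are illustrative)."""
    timestamp: datetime   # when the inquiry occurred
    inquirer: str         # who received the information
    situation: str        # the user's situation at the time of inquiry
    face: str             # the face that governed the disclosure
    disclosed: dict       # dimension -> value actually released

# Reviewing the log, a user might notice a disagreeable disclosure and then
# tighten the precision settings of the face that produced it.
log = [
    DisclosureLogEntry(datetime(2004, 5, 1, 22, 0), "Roommate", "Studying",
                       "Studious Face", {"location": "library, 2nd floor"}),
]
regrettable = [e for e in log if e.inquirer == "Roommate" and "location" in e.disclosed]
```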

Figure 21-1. The Faces GUI for creating and assigning faces; each face holds information precision preferences for disclosures to the associated inquirer when the user is in the associated situation; in this example, the user is choosing a face to handle inquiries from his roommate whenever he is studying


Figure 21-2. Faces maintains a "disclosure log" that tracks the release of potentially private information; this log allows users to ascertain the characteristics of disagreeable disclosures and refine their preferences to prevent similar disclosures in the future


Later we will show that the design of Faces involved some crucial missteps that are also present in other systems. What clued us in to the fundamental nature of these missteps is that we made them despite a substantive requirements gathering effort (details in Lederer et al.[6]). We reviewed the literature. We interviewed 12 local residents solicited from a public community web site, walking them through a series of scenarios to elicit how they might think about privacy in ubicomp. We surveyed 130 people on the Web to investigate factors that determine privacy preferences in ubicomp.[7] And we iterated through a series of low-fidelity designs. The functional upshot of our findings was that the identity of the inquirer is a primary determinant of users' privacy preferences, but that the situation in which the information is disclosed is also important.

[6] Lederer et al., "Managing Personal Information Disclosure."

[7] Scott Lederer, Jennifer Mankoff, and Anind K. Dey, "Who Wants to Know What When? Privacy Preference Determinants in Ubiquitous Computing." Extended Abstracts of Conference on Human Factors in Computer Systems (CHI 2003; Ft. Lauderdale, FL, 2003).

Accordingly, we designed Faces to let users assign different disclosure preferences to different inquirers, optionally parameterized by situation (a conjunction of location, activity, time, and nearby people). We employed the metaphor of faces to represent disclosure preferences. This is a fairly direct implementation of Goffman, who posited that a person works to present himself to an audience in such a way as to maintain a consistent impression of his role in relation to that audience: to maintain the appropriate face.[8] Prior to any affected disclosures, users employ a desktop application to specify their preferences for subsequent disclosures by creating 3-tuples of inquirers, situations, and faces, with each 3-tuple meaning "if this inquirer wants information about me when I am in this situation, show her this face" (Figure 21-3). Wildcards are allowed in the inquirer and situation slots to handle requests from unregistered inquirers (General Public) or when conditions do not meet the parameters of any registered situations (Default Situation). The preferences established in the desktop module are automatically synchronized with a handheld module that affords in situ feedback and control (Figure 21-4) and that we envisioned would communicate the user's preferences to nearby ubicomp systems in the manner of Langheinrich's Privacy Awareness System.[9]

[8] Erving Goffman, The Presentation of Self in Everyday Life (New York: Doubleday, 1956).

[9] Marc Langheinrich, "A Privacy Awareness System for Ubiquitous Computing Environments." 4th International Conference on Ubiquitous Computing (Ubicomp 2002; Göteborg, Sweden, 2002).
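The following is a minimal sketch of this lookup, assuming the 3-tuples are stored in a dictionary keyed on (inquirer, situation) and that the General Public and Default Situation wildcards are tried in a fixed fallback order. The face names, fallback order, and data structure are illustrative, not the actual Faces implementation.

```python
GENERAL_PUBLIC = "General Public"        # wildcard for unregistered inquirers
DEFAULT_SITUATION = "Default Situation"  # wildcard for unmatched situations

# Each entry corresponds to one (inquirer, situation, face) 3-tuple.
preferences = {
    ("Roommate", "Studying"): "Studious Face",
    ("Roommate", DEFAULT_SITUATION): "Casual Face",
    (GENERAL_PUBLIC, DEFAULT_SITUATION): "Anonymous Face",
}

def face_for(inquirer: str, situation: str) -> str:
    """Resolve which face handles an inquiry, falling back to wildcards."""
    for key in ((inquirer, situation),
                (inquirer, DEFAULT_SITUATION),
                (GENERAL_PUBLIC, situation),
                (GENERAL_PUBLIC, DEFAULT_SITUATION)):
        if key in preferences:
            return preferences[key]
    return "Anonymous Face"  # conservative default if nothing matches

face_for("Roommate", "Studying")  # -> "Studious Face"
face_for("Stranger", "Jogging")   # -> "Anonymous Face"
```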

Figure 21-3. Disclosure precision preferences, encapsulated in faces, are indexed on a per-inquiry basis, according to the inquirer's identity and the user's situation at the time of inquiry


Figure 21-4. The handheld module affords in situ feedback and control; users can override active preferences and save a snapshot of current contextual variables (e.g., location, time) for subsequent use as a situation parameter; nested menus offer deeper configuration options


Each face alters the disclosed information by specifying the precision at which to disclose it. Faces supports four ordinal levels of precision, from Undisclosed (disclose nothing) through Vague and Approximate to Precise (disclose everything). Each face lets the user apply a setting from this scale to each of four information dimensions: identity, location, activity, and nearby people (Figure 21-5). Adjusting the precision of information can desensitize it, allowing for different versions of the same information to reach different inquirers, depending on the situation.[10] For example, a woman might permit her spouse to employ a locator system to determine that she is at her physician's office (precise), but she might prefer that inquisitive friends learn only that she is downtown (vague).

[10] Lederer et al., "Who Wants to Know What When?"
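One way to picture how a face's precision settings desensitize a disclosure is the Python sketch below. The ordinal scale follows the text, but the two example faces and the ladder of location versions are hypothetical rather than drawn from the Faces implementation.

```python
from enum import IntEnum

class Precision(IntEnum):
    UNDISCLOSED = 0   # disclose nothing
    VAGUE = 1
    APPROXIMATE = 2
    PRECISE = 3       # disclose everything

# Each face assigns a precision level to each of the four information dimensions.
spouse_face = {"identity": Precision.PRECISE, "location": Precision.PRECISE,
               "activity": Precision.PRECISE, "nearby_people": Precision.PRECISE}
friends_face = {"identity": Precision.PRECISE, "location": Precision.VAGUE,
                "activity": Precision.UNDISCLOSED, "nearby_people": Precision.UNDISCLOSED}

# Hypothetical versions of one location reading at each precision level.
location_versions = {
    Precision.UNDISCLOSED: None,
    Precision.VAGUE: "downtown",
    Precision.APPROXIMATE: "the medical district",
    Precision.PRECISE: "her physician's office",
}

def disclose_location(face):
    """Return the version of the location permitted by the face's setting."""
    return location_versions[face["location"]]

disclose_location(spouse_face)   # -> "her physician's office"
disclose_location(friends_face)  # -> "downtown"
```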

Through its emphasis on inquirers, situations, and precision preferences, Faces operationalizes three of Adams and Sasse's four factors that determine the perception of privacy in richly sensed environments: recipient, context, and sensitivity.[11] We did not directly address the fourth factor (usage) because, as Faces emphasizes a priori preference provisioning, it is often impractical to predict how an observer will use observed information.[12]

[11] Anne Adams and M. Angela Sasse, "Taming the Wolf in Sheep's Clothing: Privacy in Multimedia Communications," 7th ACM International Conference on Multimedia (Orlando, FL, 1999).

[12] Victoria Bellotti and Abigail Sellen, "Design for Privacy in Ubiquitous Computing Environments." 3rd European Conference on Computer Supported Cooperative Work (ECSCW '93; Milano, Italy, 1993).

Figure 21-5. Each face contains disclosure preferences for identity, location, activity, and nearby people


21.2.2. Formative Evaluation

A formative evaluation revealed fundamental problems with the Faces concept (details in Lederer et al.[13]). Flaws in the visual and surface-level interaction design of the software also contributed to negative evaluation results. However, we have been careful to focus our interviews with participants and our resulting analysis on problems rooted in the conceptual model behind the interaction design: problems that even optimal interaction and visual design could not sufficiently overcome.

[13] Lederer et al., "Managing Personal Information Disclosure."

After a thorough introduction and tutorial, five participants used the system to configure their privacy preferences regarding two inquirers and two situations of their choice. That is, they each created two inquirer entities in the Faces user interface to represent two parties whom they felt would regularly be interested in their location, activity, etc., followed by two situation entities representing situations they often find themselves in, followed by a set of faces encoding the precision preferences they felt comfortable applying to disclosures to those inquirers in those situations. At a minimum, this means they would create a single face to handle both inquirers in both situations; at a maximum, they would create four unique faces, one for each combination of the two inquirers and the two situations. We then described a series of hypothetical but realistic scenarios involving those same inquirers and situations and asked the participants to consider and state the precision levels at which they would prefer to disclose their information to those inquirers in those scenarios.


Note: By scenario we mean a specific activity in a specific context (e.g., buying a pint of chocolate ice cream at the grocery store on Main Street at 10:00 on a Saturday night). We chose our scenarios to be specific, somewhat sensitive events that met the constraints of the more general situations created in the Faces user interface (e.g., shopping during the weekend).

Results showed that participants' a priori configured preferences often differed pointedly from their stated preferences during the scenarios. That is, when confronted with a realistic description of a specific scenario, participants' disclosure preferences differed from what they had previously thought they would be. Further, they had difficulty remembering the precision preferences they had specified inside their faces. This clouded their ability to predict the characteristics of any given disclosure: they might remember the name of the face that would be indexed by the characteristics of a given disclosure, but they would be hard pressed to recall exactly how that face would affect the disclosure.

Subsequent interviews with the participants corroborated these results and also brought the faces metaphor into significant question. Participants expressed discomfort with the indirection between faces and the situations in which they apply. In their minds, a situation and the face one "wears" in it are inseparable; they are, for practical purposes, the same thing.

Together these results illustrate the misstep of separating the privacy management process from the contexts in which it applies. While Faces modeled Goffman's theory in the interface, it inhibited users from practicing identity management through the interface. Users had to think explicitly about privacy in the abstract, and instruct the system to model an external representation of their privacy practices, instead of managing privacy intuitively through their actions in situ.[14]

[14] Leysia Palen and Paul Dourish, "Unpacking 'Privacy' for a Networked World," Conference on Human Factors in Computing Systems (CHI 2003; Fort Lauderdale, FL, 2003).

Having identified these design flaws despite a reasonable design process, we reviewed other privacy-affecting systems in search of similar mistakes. The practical outcome of this analysis is our description of a set of five pitfalls to beware of when designing for personal privacy, presented in the following sections with evidence of designs both succumbing to and avoiding them. After articulating the five pitfalls, we will analyze the Faces system with respect to them.


