21.4. Discussion

Having described the five pitfalls and provided evidence of systems that fall into and avoid them, we now examine some of the deeper implications they have for design. We begin by elaborating on the influence of our first two pitfalls on the user's mental model of his information trajectories. This leads to the introduction of a new conceptual tool to help the design process. Then we present an analytical argument for why designs that avoid our five pitfalls can support the human processes of understanding and action necessary for personal privacy maintenance. Using our Faces prototype as a case study, we then show how falling into these pitfalls can undermine an otherwise ordinary design process. Finally we discuss some successful systems that have largely avoided the pitfalls.

21.4.1. Mental Models of Information Flow

As we said earlier, avoiding our first two pitfalls (obscuring potential and actual information flow) can clarify the extent to which users' actions engage the system's range of privacy implications. Users can understand the consequences of their use of the system thus far, and they can predict the consequences of future use.

Illuminating disclosure contributes constructively to the user's mental model of the portrayal of her identity and behavior in the context of the system. If she has a reasonable understanding of what observers can learn about her (Pitfall 1) and of what they already know about her (Pitfall 2), she can maintain and exploit this mental model to influence the portrayal of her identity and associated activities over time.

In the context of interactive systems, the personal information a user conveys is often tightly integrated with her interaction with the system. For example, by simply browsing the Web, a user generates a rich clickstream that can be used by observers in ways that directly impact her life. When interaction and disclosure are integrated in such a way, an informed user's mental model of the system's operation and her mental model of her disclosures are interdependent.

This suggests an extension to Norman's canonical elucidation of the role of mental models in the design process. According to Norman, the designer's goal is to design the system image (i.e., those aspects of the implementation with which the user interacts) such that the user's mental model of the system's operation coincides with the designer's mental model of the same.[65]

[65] Donald A. Norman, The Design of Everyday Things (New York: Basic Books, 1988).

When we take into account the coupling of interaction and disclosure, we see that the designer's goal has expanded. She now strives to design the system image such that the user's mental models of the system's operation and of the portrayal of his identity and behavior through it are both accurate. As with Norman's original notion, ideally the designer's and the user's models of the system's operation will coincide. But the designer generally cannot have a model of the user's personal information; that depends on the user and the context of use. Indeed, here the designer's task is not to harmonize the user's model of his information flow with her own (she likely has none), but to harmonize the user's information model with the observer's (Figure 21-7). In other words, she wants to design the system image to accurately convey a model not only of how other parties can observe the user's behavior through the system, but also of what they can and do observe.

Figure 21-7. Building on Norman's elucidation of the role of mental models in the design process, designers can aim to harmonize the user's and the observer's understanding of the user's personal information disclosures


21.4.2. Opportunities for Understanding and Action

We have argued that people maintain personal privacy by understanding the privacy implications of their sociotechnical contexts and influencing them through socially meaningful action. When a technical system is embedded into a social process, the primary means its designers have to engender understanding and action are its feedback and control mechanisms. We encourage designers of privacy-affecting systems to think of feedback and control mechanisms as opportunities for understanding and action. They are the designer's opportunity to empower those processes, and they are the user's opportunity to practice them.

This orientation can help designers reach across what Ackerman calls the sociotechnical gap (the difference between systems' technical capabilities and their social requirements)[66] just enough to empower informed social action. The challenge is to find that intermediate point where carefully designed technical feedback and control translates into social understanding and action. Reaching too far can overwhelm the user. Not reaching far enough can disempower him.

[66] Ackerman.

We believe that avoiding the pitfalls can help designers reach that intermediate point. Carefully designed feedback about potential and actual information flow can help users understand the representation and conveyance of their behavior through the system. Curtailing configuration, providing coarse-grained control, and supporting established practices can help people make productive, intuitive use of a privacy-affecting system. Designs that heed these suggestions make their consequences known and do not require great effort to use, helping people incorporate them meaningfully into their everyday privacy practices.

21.4.3. Negative Case Study: Faces

We return now to Faces (our prototypical ubicomp privacy UI) as a case study in how a design can fall into the pitfalls.


Obscuring potential information flow

In trying to be a UI for managing privacy across any ubicomp system, Faces abstracted away the true capabilities of any underlying system. Users could not gauge its potential information flow because it aimed to address all information flow. Its scope was impractically broad and effectively incomprehensible.


Obscuring actual information flow

Faces conveyed actual information flow through the disclosure log. Each record was accessible after the relevant disclosure. While this design intends to illuminate information flow, it is unclear whether postponing notice is optimal. Embedding notice directly into the real-time experience of disclosure might foster a stronger understanding of information flow.
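The tradeoff between post hoc review and real-time notice can be sketched in code. The following is a minimal illustration, not the Faces implementation; the class and method names (`DisclosureLog`, `record`, `review`, `on_disclose`) are hypothetical. It shows how a single log structure can support both the Faces-style after-the-fact review and an optional callback that surfaces each disclosure as it happens.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, List, Optional

@dataclass
class Disclosure:
    """One record of personal information flowing to an observer."""
    observer: str
    info: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class DisclosureLog:
    """Stores disclosures for post hoc review and, if a callback is
    supplied, also surfaces each disclosure at the moment it occurs."""

    def __init__(self, on_disclose: Optional[Callable[[Disclosure], None]] = None):
        self._records: List[Disclosure] = []
        self._on_disclose = on_disclose

    def record(self, observer: str, info: str) -> Disclosure:
        d = Disclosure(observer, info)
        self._records.append(d)
        if self._on_disclose is not None:
            # Real-time notice embedded in the experience of disclosure,
            # rather than deferred to a later review session.
            self._on_disclose(d)
        return d

    def review(self) -> List[Disclosure]:
        """Post hoc review, as in the Faces disclosure log."""
        return list(self._records)
```

A design like Faces corresponds to constructing the log with no callback; a real-time-notice design passes one, for example `DisclosureLog(on_disclose=lambda d: print(f"{d.info} disclosed to {d.observer}"))`.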


Emphasizing configuration over action

Faces required a considerable amount of configuration. Once configuration was done, and assuming it was done correctly, the system was designed to require little ad hoc configuration. The user would simply go about his business. But the sheer amount and desituated nature of configuration severely limited the system's chances of operating in alignment with the user's in situ preferences, positioning Faces squarely in this pitfall.


Lacking coarse-grained control

Faces avoided this pitfall by including an override function that afforded quick transitions to alternate faces.


Inhibiting established practice

While Faces modeled the nuance of Goffman's identity management theory, it appeared to hinder its actual practice by requiring the user to maintain virtual representations of his fragmented identities in addition to manifesting them naturally through intuitive, socially meaningful behavior.

Our evaluation of Faces revealed a complex, abstract configuration requirement at odds with the intuitive practice of privacy in real settings. Faces also aimed to address, by itself, privacy needs across an arbitrary range of ubicomp systems and information types; the futility of that ambition becomes apparent once we recognize that privacy management extends across systems, involving fluid, heterogeneous assemblies of technologies, practices, and information types.

21.4.4. Positive Case Study: Instant Messaging and Mobile Telephony

Interestingly, two systems that largely avoid our pitfalls, mobile phones and instant messaging (IM), are primarily communication media. That is, disclosure is their central function. We will briefly assess these services against the pitfalls, focusing on their primary functions (textual and vocal communication) and on some of their secondary features that support these functions. We will not address orthogonal, controversial features like the location-tracking capabilities of some mobile phones and the capture of IM sessions, which would have to be addressed by a more robust assessment of the privacy implications of these technologies.

IM and mobile telephony each make clear the potential and actual flow of disclosed information, making for a robust, shared mental model of information flow through these cooperative interactive systems. Potential flow is scoped by features like Caller ID (telephony), Buddy Lists (IM), and feedback about the user's own online presence (IM). Actual flow is largely self-evident in the contents of the communications. Each technology requires minimal configuration for maintaining privacy (although secondary features often require excessive configuration), largely due to coarse-grained controls for halting and resuming information flow: for example, invisible mode (IM), application exit (IM), power button (telephony), and ringer volume (telephony). Finally, each supports existing practices of plausible deniability (people can choose to ignore incoming messages and calls without having to explain why) and of ambiguous disclosure: the linguistic nature of each medium allows for arbitrary customization of disclosed information.[67], [68]

[67] Nardi, Whittaker, and Bradner.

[68] Woodruff and Aoki.
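The coarse-grained controls described above can be sketched as a single switch that halts or resumes all outgoing information flow, in the style of IM's invisible mode or a phone's power button. This is an illustrative sketch under assumed names (`PresenceChannel`, `set_visible`, `send`), not an API of any real IM system; the point is that one binary control governs every disclosure, with no per-observer rules to configure.

```python
class PresenceChannel:
    """A coarse-grained privacy control: one switch halts or resumes
    all outgoing information flow, rather than requiring the user to
    configure fine-grained, per-observer disclosure rules."""

    def __init__(self) -> None:
        self.visible = True  # analogous to being online / powered on

    def set_visible(self, visible: bool) -> None:
        """The single, binary control (invisible mode / power button)."""
        self.visible = visible

    def send(self, message: str) -> bool:
        """Attempt a disclosure; returns True if it actually flowed."""
        if not self.visible:
            # Flow is halted wholesale: nothing is disclosed,
            # and there are no exceptions to configure.
            return False
        # ... deliver message to observers ...
        return True
```

Toggling `set_visible(False)` before stepping away, then `set_visible(True)` on return, mirrors the quick halt/resume transitions these media afford.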

Indeed, communication media might serve as a model for designing other privacy-affecting systems not conventionally categorized as communication technologies. Disclosure is communication, whether it results from the use of a symmetric linguistic medium (e.g., telephony) or an asymmetric event-based medium (e.g., e-commerce, context-aware systems). Systems that affect privacy but are not positioned as communication media do nonetheless communicate personal information to observers. Exposing and addressing these disclosure media as communication media might liberate designs to leverage users' intuitive privacy maintenance skills.



Security and Usability: Designing Secure Systems That People Can Use
ISBN: 0596008279
Year: 2004
Pages: 295
