21.3. Five Pitfalls to Heed When Designing for Privacy

Our pitfalls encode common problems in interaction design across several systems, constituting a preventative guide to help designers avoid mistakes that may appear obvious in retrospect but that continue to be made nonetheless. We encourage designers to carefully heed the pitfalls throughout the design cycle. Naturally, they will apply in different ways and to different degrees for each system. They should be interpreted within the context of the design task at hand.

The pitfalls fit into a history of analyses and guidelines on developing privacy-sensitive systems. They are, in part, an effort to reconcile Palen and Dourish's theoretical insights about how people practice privacy[15] with Bellotti and Sellen's guidelines for designing feedback and control to support it.[16] In reaching for this middle ground, we have tried to honor the fair information practices, as developed by Westin[17] and more recently adapted to the ubicomp design space by Langheinrich,[18] and to encourage minimum information asymmetry between subjects and observers, as argued by Jiang, Hong, and Landay.[19]

[15] Ibid.

[16] Bellotti and Sellen.

[17] Alan Westin, Privacy and Freedom (New York: Atheneum, 1967).

[18] Marc Langheinrich, "Privacy by Design: Principles of Privacy-Aware Ubiquitous Systems," 3rd International Conference on Ubiquitous Computing (Ubicomp 01; Atlanta, 2001).

[19] Xiaodong Jiang, Jason I. Hong, and James A. Landay, "Approximate Information Flows: Socially Based Modeling of Privacy in Ubiquitous Computing," 4th International Conference on Ubiquitous Computing (Ubicomp 02; Göteborg, Sweden, 2002).

21.3.1. Concerning Understanding

Avoiding our first two pitfalls can help fortify the user's understanding of a system's privacy implications by illuminating the system's potential for information disclosure in the future and the actual disclosures made through it in the present and past.

21.3.1.1 Pitfall 1: Obscuring potential information flow

To whatever degree is reasonable, systems should make clear the nature and extent of their potential for disclosure. Users will have difficulty appropriating a system into their everyday practices if the scope of its privacy implications is unclear. This scope includes:

  • The types of information the system conveys

  • The kinds of observers to which it conveys information

  • The media through which information is conveyed

  • The length of retention

  • The potential for unintentional disclosure

  • The presence of third-party observers

  • The collection of meta-information like traffic analysis

Clarifying a system's potential for conveying personal information is vital to users' ability to predict the social consequences of its use.
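
One way to keep this scope from being obscured is to treat it as an explicit, reviewable design artifact. The following sketch is purely illustrative (the class, field names, and badge example are our own assumptions, not drawn from any system discussed here); it shows how a design team might enumerate a system's disclosure scope in a form that can drive both internal review and user-facing notices.

```python
# Hypothetical sketch: declaring a system's disclosure scope as an explicit,
# reviewable artifact. Field names and example values are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DisclosureScope:
    info_types: List[str]            # what the system can convey
    observers: List[str]             # who can receive it
    media: List[str]                 # channels it travels over
    retention_days: Optional[int]    # None means "not retained"
    unintentional_channels: List[str] = field(default_factory=list)
    third_parties: List[str] = field(default_factory=list)
    meta_information: List[str] = field(default_factory=list)

    def summary(self) -> str:
        """Render a plain-language notice a user could actually read."""
        kept = ("not retained" if self.retention_days is None
                else f"kept for {self.retention_days} days")
        return (f"This system can convey {', '.join(self.info_types)} "
                f"to {', '.join(self.observers)} via {', '.join(self.media)}; "
                f"records are {kept}.")

# Example: a hypothetical workplace locator badge
badge_scope = DisclosureScope(
    info_types=["current room-level location"],
    observers=["coworkers", "facilities staff"],
    media=["badge radio link", "workplace intranet"],
    retention_days=30,
    unintentional_channels=["absence inferred from gaps in badge sightings"],
    meta_information=["movement patterns over time"],
)
print(badge_scope.summary())
```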

Among the conveyable information types to elucidate are identifiable personae (e.g., true names, login names, email addresses, credit card numbers, Social Security numbers) and monitorable activities (broadly, any of the user's interpretable actions and/or the contexts in which they are performed, such as locations, purchases, clickstreams, social relations, correspondences, and audio/video records). This dichotomy of personae and activities, although imperfect and coarse, can be useful shorthand for conceptualizing a user's identity space, with personae serving as indices to dynamically intersecting subspaces and activities serving as the contents of those subspaces.[20] People work to maintain consistency of character with respect to a given audience, in effect ensuring that an audience cannot access an identity subspace to which it does not already have an index. This can require considerable effort because boundaries between subspaces are fluid and overlapping. Conveying evidence of activity out of character with the apposite persona can rupture the carefully maintained boundaries between identity subspaces, collapsing one's fragmented identities and creating opportunities for social, bodily, emotional, and financial harm.[21]

[20] danah boyd, "Faceted Id/Entity: Managing Representation in a Digital World," M.S. Thesis, Massachusetts Institute of Technology, 2002.

[21] David J. Phillips, "Context, Identity, and Privacy in Ubiquitous Computing Environments," Workshop on Socially Informed Design of Privacy-Enhancing Solutions in Ubiquitous Computing, Ubicomp 2002 Conference (Göteborg, Sweden, 2002).

Privacy-affecting systems tend to involve disclosure both between people and between a person and an organization. Designs should address the potential involvement of each, clarifying if and how primarily interpersonal disclosures (e.g., chat) involve incidental organizational disclosures (e.g., workplace chat monitoring) and, conversely, if and how primarily organizational disclosures (e.g., workplace cameras) involve secondary interpersonal disclosures (e.g., mediaspaces).

Privacy is a broad term whose unqualified use as a descriptor can mislead users into thinking that a system protects or erodes privacy in ways it does not. Making the scope of a system's privacy implications clear will help users understand its capabilities and limits. This, in turn, provides grounding for comprehending the actual flow of information through the system, addressed in pitfall 2, described in the next section.

21.3.1.2 Evidence: Falling into the pitfall

An easy way to obscure a system's privacy scope is to present its functionality ambiguously. One example is Microsoft's Windows operating systems, whose Internet control panel offers ordinal degrees of privacy protection (from Low to High, as shown in Figure 21-6). First, the functional meaning of this scale is unclear to average users. Second, despite being a component of the operating system's control panel, this mechanism does not govern privacy for Internet use in general; its scope is limited to a single web browser's cookie management heuristics.

Similarly, Anonymizer.com's free anonymizing software can give the impression that all Internet activity is anonymous when the service is active, but in actuality it affects only web browsing, not email, chat, or other services. A for-pay version covers those services.

Another example is found in Beckwith's report of an eldercare facility that uses worn transponder badges to monitor the locations of residents and staff.[22] Many residents perceived the badge only as a call-button (which it was), but not as a persistent location tracker (which it also was). They did not understand the disclosures it was capable of facilitating.

[22] Richard Beckwith, "Designing for Ubiquity: The Perception of Privacy," IEEE Pervasive Computing 2:2 (2003), 40-46.

Figure 21-6. The Privacy tab of Internet Explorer's Internet Options control panel offers ordinal degrees of privacy protection (from Low to High), but most users do not understand what the settings actually mean


Similarly, some hospitals use badges to track the location of nurses for efficiency and accountability purposes but neglect to clarify what kind of information the system conveys. Erroneously thinking the device was also a microphone, one concerned nurse wrote, "They've placed it in the nurses' lounge and kitchen. Somebody can click it on and listen to the conversation. You don't need a Big Brother looking over your shoulder."[23]

[23] Putsata Reang, "Dozens of Nurses in Castro Valley Balk at Wearing Locators," Mercury News (Sept. 6, 2002).

A recent example of a privacy-affecting system that has given ambiguous impressions of its privacy implications is Google's Gmail email system. Gmail's content-triggered advertisements have inspired public condemnation and legal action over claims of invading users' privacy.[24] Some critics may believe that Google discloses email content to advertisers (which Gmail's architecture prohibits), while some may simply protest the commercial exploitation, automated or not, of the content of personal communications. Despite publishing a conspicuous and concise declaration on Gmail's home page that "no email content or other personally identifiable information is ever provided to advertisers,"[25] the privacy implications of Gmail's use were unclear to many users when it launched. Equally unclear, however, is whether the confusion could have been avoided, since other factors beyond system and interaction design were in play. In particular, Google's idiosyncratic brand prominence and reputation for innovation, catalyzed by Gmail's sudden appearance, ensured an immediate, and immediately critical, market of both sophisticated and naïve users.

[24] Lisa Baertlein, "Calif. Lawmaker Moves to Block Google's Gmail," Reuters (Apr. 12, 2004).

[25] Google, "About Gmail" [accessed Jan. 13, 2005]; http://gmail.google.com/gmail/help/about.html.

21.3.1.3 Evidence: Avoiding the pitfall

Many web sites that require an email address for creating an account give clear notice on their sign-up forms that they do not share email addresses with third parties or use them for extraneous communication with the user. Clear, concise statements like these help clarify scope and are becoming more common.

Tribe.net is a social networking service that carefully makes clear that members' information will be made available only to other members within a certain number of degrees of social separation. Of course, this in no way implies that users' privacy is particularly safeguarded, but it does make explicit the basic scope of potential disclosures, helping the user understand her potential audience.

21.3.1.4 Pitfall 2: Obscuring actual information flow

Having addressed the user's need to understand a system's potential privacy implications, we move now to instances of actual disclosure. To whatever degree is reasonable, designs should make clear the actual disclosure of information through the system. Users should understand what information is being conveyed to whom. The disclosure should be obvious to the user as it occurs; if this is impractical, notice should be provided within a reasonable timeframe. Feedback should sufficiently inform, but not overwhelm, the user.

By avoiding both this and the prior pitfall, designs can clarify the extent to which users' actions engage the system's range of privacy implications. This can help users understand the consequences of their use of the system thus far, and predict the consequences of future use. In the "Discussion" section, we will elaborate on how avoiding both of these pitfalls can support the user's mental model of his personal information flow.

We will not dwell on this pitfall, for it is perhaps the most obvious of the five. We suggest Bellotti and Sellen (1993) as a guide to exposing actual information disclosure.
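
As a hypothetical illustration of the principle (the class and method names below are our own, not part of any cited system), a design can route every outbound disclosure through a single chokepoint that both delivers the data and surfaces feedback about who just learned what, so that notice is a side effect of disclosure rather than a separate reporting feature.

```python
# Hypothetical sketch: routing every disclosure through one chokepoint that
# both delivers the data and produces user-visible feedback about it.
import datetime
from typing import Callable, List, Tuple

class DisclosureLog:
    def __init__(self, notify: Callable[[str], None]):
        self.notify = notify                     # e.g., a status-bar message
        self.history: List[Tuple[datetime.datetime, str, str]] = []

    def disclose(self, observer: str, description: str,
                 send: Callable[[], None]) -> None:
        """Deliver the data, then record and surface what just happened."""
        send()                                    # the actual transmission
        stamp = datetime.datetime.now()
        self.history.append((stamp, observer, description))
        self.notify(f"Shared {description} with {observer} at {stamp:%H:%M}")

    def recent(self, n: int = 5):
        """What the user sees when asking 'who has learned what about me?'"""
        return self.history[-n:]

# Usage: disclosing presence status to a buddy-list service
log = DisclosureLog(notify=print)
log.disclose("buddy-list subscribers", "my online status",
             send=lambda: None)  # stand-in for the real network call
```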

21.3.1.5 Evidence: Falling into the pitfall

Web browser support for cookies is a persistent example of obscuring information flow.[26] Most browsers do not, by default, indicate when a site sets a cookie or what information is disclosed through its use. The prevalence of third-party cookies and web bugs (tiny web page images that facilitate tracking) exacerbates users' ignorance of who is observing their browsing activities.

[26] Lynette I. Millett, Batya Friedman, and Edward Felten, "Cookies and Web Browser Design: Toward Realizing Informed Consent Online," Conference on Human Factors in Computing Systems (CHI 2001; Seattle, 2001). See also Chapter 24, this volume.

Another example of concealed information flow is in the KaZaA P2P file-sharing application, which has been shown to facilitate the concealed disclosure of highly sensitive personal information to unknown parties.[27]

[27] Nathaniel S. Good and Aaron Krekelberg, "Usability and Privacy: A Study of KaZaA P2P File-Sharing." Conference on Human Factors in Computing Systems (CHI 2003; Ft. Lauderdale, FL, 2003). See also Chapter 33, this volume.

Another example is worn locator badges like those described in Harper et al.[28] and Beckwith,[29] which generally do not inform their wearers about who is locating them.

[28] R. H. R. Harper, M. G. Lamming, and W. H. Newman, "Locating Systems at Work: Implications for the Development of Active Badge Applications," Interacting with Computers 4:3 (1992), 343-363.

[29] Beckwith.

21.3.1.6 Evidence: Avoiding the pitfall

Friedman et al.'s redesign of cookie management reveals what information is disclosed to whom. They extended the Mozilla web browser to provide prominent visual feedback about the real-time placement and characteristics of cookies, thereby showing users what information is being disclosed to what web sites.[30]

[30] Batya Friedman, Daniel C. Howe, and Edward W. Felten, "Informed Consent in the Mozilla Browser: Implementing Value-Sensitive Design," 35th Annual Hawaii International Conference on System Sciences (HICSS 02; Hawaii, Jan. 2002). See also Chapter 24, this volume.

Some instant messaging systems employ a symmetric design that informs the user when someone wants to add that user to a contact list, allowing him to do the same. This way, he knows who is likely to see his publicized status. Further, his status is typically reflected in the user interface, indicating exactly what others can learn about him by inspecting their buddy lists.

AT&T's mMode Find People Nearby service, which lets mobile phone users locate other users of the service, informs the user when someone else is locating him. He learns who is obtaining what information.

21.3.2. Concerning Action

Our last three pitfalls involve a system's ability to support the conduct of socially meaningful action. Instead of occurring through specific configurations of technical parameters within a system, everyday privacy regulation often occurs through the subtle manipulation of coarse controls across devices, applications, artifacts, and time. In other words, people manage privacy through regularly reassembled metasystems of heterogeneous media,[31] with observers discerning socially meaningful actions through the accumulation of evidence across these media. Privacy-sensitive technical systems can help users intuitively shape the nature and extent of this evidence to influence the social consequences of their behavior.

[31] Matthew Chalmers and Areti Galani, "Seamful Interweaving: Heterogeneity in the Theory and Design of Interactive Systems," Designing Interactive Systems (DIS 2004; Cambridge, MA, 2004).

21.3.2.1 Pitfall 3: Emphasizing configuration over action

Designs should not require excessive configuration to create and maintain privacy. They should enable users to practice privacy management as a natural consequence of their ordinary use of the system.

Palen and Dourish observe that people manage privacy as a dynamic, dialectic process of boundary negotiation, not as the static enforcement of rules set out in advance. But because configuration has become a universal user interface design pattern, many systems fall into the configuration pitfall.

Configured privacy breaks down for at least two reasons. First, in real settings, users manage privacy semi-intuitively; they do not spell out their privacy needs in an auxiliary, focused effort.[33] Configuration imposes an awkward requirement on users, one they will often forsake in favor of default settings.[34], [35] If users are to manage their privacy at all, it needs to be done in an intuitive fashion, as a predictable outcome of their situated actions involving the system. As noted by Cranor in Chapter 22 of this volume, "most people have little experience articulating their privacy preferences."

[33] Whitten and Tygar.

[34] Leysia Palen, "Social, Individual & Technological Issues for Groupware Calendar Systems," Conference on Human Factors in Computing Systems (CHI 99; Pittsburgh, PA, 1999).

[35] Wendy E. Mackay, "Triggers and Barriers to Customizing Software," Conference on Human Factors in Computing Systems (CHI 91; New Orleans, 1991).

A second reason that configured privacy breaks down is that the act of configuring preferences is too easily desituated from the contexts in which those preferences apply. Users are challenged to predict their needs under hypothetical circumstances, removed from the intuitive routines and disruptive exceptions that constitute the real-time real world. If they predict wrongly during configuration, their configured preferences will differ from their in situ needs, creating the conditions for an invasion of privacy.

People generally do not set out to explicitly protect their privacy, an example of Whitten and Tygar's "unmotivated user" property.[36] People do not sit down at their computers to protect their privacy (or, in Whitten's case, to manage their security). Rather, they participate in some activity, with privacy regulation being an embedded component of that activity. Designs should take care not to extract the privacy regulation process from the activity within which it is normally conducted.

[36] Whitten and Tygar.

21.3.2.2 Evidence: Falling into the pitfall

An abundance of systems emphasize explicit configuration of privacy, including experimental online identity managers,[37], [38] P2P file-sharing software,[39] web browsers,[40] and email encryption software.[41] In the realm of ubiquitous computing, both our Faces prototype and Bell Labs's Houdini Project[42] require significant configuration efforts prior to and after disclosures.

[37] boyd, 2002.

[38] Uwe Jendricke and Daniela Gerd tom Markotten, "Usability Meets Security: The Identity-Manager As Your Personal Security Assistant for the Internet," 16th Annual Computer Security Applications Conference (ACSAC 00; New Orleans, Dec. 2000).

[39] Good and Krekelberg.

[40] Millett et al.

[41] Whitten and Tygar.

[42] Richard Hull, Bharat Kumar, Daniel Lieuwen, Peter Patel-Schneider, Arnaud Sahuguet, Sriram Varadarajan, and Avinash Vyas, "Enabling Context-Aware and Privacy-Conscious User Data Sharing," IEEE International Conference on Mobile Data Management (MDM 2004; Berkeley, CA, 2004).

21.3.2.3 Evidence: Avoiding the pitfall

Successful solutions might involve some measure of configuration but tend to embed it into the actions necessary to use the system. Web sites like Friendster.com and Tribe.net allow users to regulate information flow by modifying representations of their social networks, a process that is embedded into the very nature of these applications.

Dodgeball.com's real-time sociospatial networking service also directly integrates privacy regulation into the primary use of the system. Dodgeball members socially advertise their location by sending brief, syntactically constrained text messages from their mobile devices to Dodgeball's server, which then sends an announcement to the member's friends, and to friends of friends within walking distance. Identifying one's friends to the system does require specific configuration effort, but once done, regulating location privacy is integrated with the very use of the system. Each use actively publicizes one's location; concealing one's location simply involves not using the system.

Georgia Tech's In/Out Board lets users reveal or conceal their presence in a workspace by badging into an entryway device.[43] Its purpose is to convey this information, but it can be intuitively used to withhold information as well, by falsely signaling in/out status with a single gesture.

[43] A. K. Dey, D. Salber, and G. D. Abowd, "A Conceptual Framework and a Toolkit for Supporting the Rapid Prototyping of Context-Aware Applications," Human-Computer Interaction 16:2-4 (2001), 97-166.

Cadiz and Gupta propose a smart card that one could hand to a receptionist to grant him limited access to one's calendar to schedule an appointment; he would hand it back immediately afterward. No one would have to fumble with setting permissions. They also suggest extending scheduling systems to automatically grant meeting partners access to the user's location during the minutes leading up to a meeting, so they can infer his arrival time. The action of scheduling a meeting would imply limited approval of location disclosure.[44]

[44] J. J. Cadiz and Anoop Gupta, "Privacy Interfaces for Collaboration," Technical Report Msr-Tr-2001-82 (Redmond, WA: Microsoft Corporation, 2001).
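
One way to render that scheduling idea concrete, sketched below under our own assumptions rather than as Cadiz and Gupta's implementation, is to have the act of booking a meeting create a time-bounded location grant that expires on its own, so that no one ever edits permission settings directly.

```python
# Hypothetical sketch of "scheduling a meeting implies limited location access":
# booking the meeting creates a grant that expires on its own, so privacy
# regulation is a side effect of ordinary use.
from datetime import datetime, timedelta
from dataclasses import dataclass
from typing import List

@dataclass
class LocationGrant:
    grantee: str
    start: datetime
    end: datetime

    def active(self, now: datetime) -> bool:
        return self.start <= now <= self.end

grants: List[LocationGrant] = []

def schedule_meeting(partner: str, meeting_start: datetime) -> None:
    """Booking the meeting is the privacy decision: the partner may see the
    user's location during the 15 minutes before the meeting, and no longer."""
    grants.append(LocationGrant(grantee=partner,
                                start=meeting_start - timedelta(minutes=15),
                                end=meeting_start))

def may_locate(requester: str, now: datetime) -> bool:
    return any(g.grantee == requester and g.active(now) for g in grants)

# Usage
start = datetime(2005, 3, 1, 14, 0)
schedule_meeting("alice@example.com", start)
print(may_locate("alice@example.com", datetime(2005, 3, 1, 13, 50)))  # True
print(may_locate("alice@example.com", datetime(2005, 3, 1, 15, 0)))   # False
```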

21.3.2.4 Pitfall 4: Lacking coarse-grained control

Designs should offer an obvious, top-level mechanism for halting and resuming disclosure. Users are accustomed to turning a thing off when they want its operation to stop. Often a power button or exit button will do the trick.

Beyond binary control, a simple ordinal control may also be appropriate in some cases (e.g., audio devices' volume and mute controls). Ubicomp systems that convey location or other context could incorporate both a precision dial (ordinal) and a hide button (binary), so users can either adjust the precision at which their context is disclosed or decidedly halt disclosure.
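
A minimal sketch of such a pairing, assuming a hypothetical location-sharing client (the class names and precision levels below are illustrative only), shows how both coarse controls can gate every disclosure:

```python
# Hypothetical sketch: a binary "hide" switch plus an ordinal precision dial,
# consulted by the disclosure path on every request.
from enum import Enum
from typing import Optional

class Precision(Enum):
    CITY = 1
    NEIGHBORHOOD = 2
    BLOCK = 3

class LocationControls:
    def __init__(self) -> None:
        self.hidden = False               # coarse binary control
        self.precision = Precision.CITY   # coarse ordinal control

    def disclose(self, city: str, neighborhood: str, block: str) -> Optional[str]:
        """Return whatever the current controls permit, or nothing at all."""
        if self.hidden:
            return None
        return {Precision.CITY: city,
                Precision.NEIGHBORHOOD: f"{neighborhood}, {city}",
                Precision.BLOCK: f"{block}, {neighborhood}, {city}"}[self.precision]

# Usage: one obvious gesture halts disclosure entirely
controls = LocationControls()
print(controls.disclose("Berkeley", "Downtown", "2100 block of Shattuck"))
controls.hidden = True                    # the "hide" button
print(controls.disclose("Berkeley", "Downtown", "2100 block of Shattuck"))  # None
```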

In the general case, users can become remarkably adept at wielding coarse-grained controls to yield nuanced results.[45] Individuals can communicate significant privacy preferences by leaving an office door wide open or ajar, capping or uncapping a camera lens,[46], [47] or publicly turning off their cell phone at the start of a meeting. Coarse-grained controls are frequently easy to engage and tend to reflect their state, providing direct feedback and freeing the user from having to remember whether she set a preference properly. This helps users accommodate the controls and even co-opt them in ways the designer may not have intended.

[45] Chalmers and Galani.

[46] Bellotti and Sellen.

[47] Gavin Jancke, Gina Danielle Venolia, Jonathan Grudin, J. J. Cadiz, and Anoop Gupta, "Linking Public Spaces: Technical and Social Issues," Conference on Human Factors in Computing Systems (CHI 01; Seattle, 2001).

While some fine-grained controls may be unavoidable, the flexibility they are intended to provide is often lost to their neglect (see Pitfall 3), which is then compensated for by the nuanced manipulation of coarse-grained controls across devices, applications, artifacts, and time.

21.3.2.5 Evidence: Falling into the pitfall

E-commerce web sites typically maintain users' shopping histories.[48] While this informs useful services like personalization and collaborative filtering, there are times when a shopper does not want the item at hand to be included in his actionable history; he effectively wants to shop anonymously during the current session (beyond the private transaction record in the merchant's database). For example, the shopper may not want his personalized shopping environment, which others can see over his shoulder, to reflect this private purchase. In our experiences, we have encountered no web sites that provide a simple mechanism for excluding the current purchase from our profiles.

[48] Lorrie Faith Cranor, "'I Didn't Buy it for Myself': Privacy and Ecommerce Personalization," Proceedings of the 2003 ACM Workshop on Privacy in the Electronic Society (Washington, D.C., Oct. 30, 2003).
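
A minimal sketch of the missing mechanism, under the assumption of a hypothetical checkout API (the function and flag names are ours): a single flag on the order controls whether the purchase feeds the user's actionable profile, while the merchant's private transaction record is retained either way.

```python
# Hypothetical sketch: a one-click "don't use this purchase for personalization"
# flag at checkout. The transaction record is kept either way; only the
# user-visible, actionable profile is affected.
from typing import Dict, List

transactions: List[Dict] = []        # merchant's private record of all orders
profiles: Dict[str, List[str]] = {}  # per-user history that drives recommendations

def checkout(user: str, item: str, include_in_profile: bool = True) -> None:
    transactions.append({"user": user, "item": item})
    if include_in_profile:
        profiles.setdefault(user, []).append(item)

# Usage
checkout("pat", "garden hose")
checkout("pat", "surprise gift", include_in_profile=False)  # the coarse control
print(profiles["pat"])        # ['garden hose'] -- the gift stays out of view
print(len(transactions))      # 2 -- but both orders are still recorded
```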

Similarly, some web browsers still bury their privacy controls under two or three layers of configuration panels.[49] While excessive configuration may itself be a problem (see Pitfall 3), the issue here is that there is typically no top-level control for switching between one's normal cookie policy and a "block all cookies" policy. Third-party applications that elevate cookie control widgets have begun to appear (e.g., GuideScope.com).

[49] Millett et al.

Further, wearable locator badges like those described in Harper et al.[50] and Beckwith[51] do not have power buttons. One could remove the badge and leave it somewhere else, but simply turning it off would at times be more practical or preferable.

[50] Harper, Lamming, and Newman.

[51] Beckwith.

21.3.2.6 Evidence: Avoiding the pitfall

Systems that expose simple, obvious ways of halting and resuming disclosure include easily coverable cameras,[52] mobile phone power buttons, instant messaging systems with invisible modes, the In/Out Board,[53] and our Faces prototype.

[52] Bellotti and Sellen.

[53] Dey, Salber, and Abowd.

21.3.2.7 Pitfall 5: Inhibiting established practice

Designers should beware inhibiting existing social practice. People manage privacy through a range of established, often nuanced, practices. For simplicity's sake, we might divide such practices into those that are already established and those that will evolve as new media of disclosure emerge. While early designs might lack elegant support for emergent practices (because, obviously, substantive practice cannot evolve around a system until after deployment), designs can at least take care to avoid inhibiting established practice.

This is effectively a call to employ privacy design patterns. Designers of privacy-affecting systems can identify and assess the existing disclosure practices into which their systems will be introduced. By supporting, and possibly enhancing, the roles, expectations, and practices already at play in these situations, designs can accommodate users' natural efforts to transfer existing skills to new media.

Certain metapractices are also worth noting. In particular, we emphasize the broad applicability of plausible deniability (whereby the potential observer cannot determine whether a lack of disclosure was intentional)[54], [55] and disclosing ambiguous information (e.g., pseudonyms, imprecise location). These common, broadly applicable techniques allow people to finesse disclosure through technical systems to achieve nuanced social ends. Systems that rigidly belie metapractices like plausible deniability and ambiguous disclosure may encounter significant resistance during deployment.[56]

[54] B. A. Nardi, S. Whittaker, and E. Bradner, "Interaction and Outeraction: Instant Messaging in Action," Conference on Computer Supported Cooperative Work (CSCW 00; New York, 2000).

[55] A. Woodruff and P. M. Aoki, "How Push-to-Talk Makes Talk Less Pushy," International Conference on Supporting Group Work (GROUP 03; Sanibel Island, FL, Nov. 2003).

[56] Lucy Suchman, "Do Categories Have Politics? The Language/Action Perspective Reconsidered," in Human Values and the Design of Computer Technology, Batya Friedman (ed.), 91106 (Stanford, CA: Center for the Study of Language and Information, 1997).
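
Ambiguous disclosure can be supported quite directly. As one hypothetical sketch (our own, not drawn from the systems cited here), a location service could snap coordinates to a user-chosen grid so that observers learn an area rather than a point:

```python
# Hypothetical sketch: disclosing an area rather than a point by snapping
# coordinates to a user-chosen grid. Larger cells mean more ambiguity.
import math

def coarsen(lat: float, lon: float, cell_degrees: float) -> tuple:
    """Snap a coordinate to the center of its grid cell."""
    def snap(value: float) -> float:
        return (math.floor(value / cell_degrees) + 0.5) * cell_degrees
    return (snap(lat), snap(lon))

# Usage: the same true position disclosed at three levels of ambiguity
true_position = (37.8716, -122.2727)            # Berkeley, CA
for cell in (0.001, 0.01, 0.1):                 # roughly 100 m, 1 km, 10 km cells
    print(cell, coarsen(*true_position, cell))
```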

Technical systems are notoriously awkward at supporting social nuance.[57] Interestingly, however, systems that survive long enough in the field often contribute to the emergence of new practices even if they suffer from socially awkward design in the first place (e.g., see Green et al.[58] and boyd[59]). In other words, emergent nuance happens. But being intrinsically difficult to predict, seed, and design for, it generally does not happen as optimally as we might like it to. Designers will continue to struggle to support emergent practices, but by identifying successful privacy design patterns, they can at least help users transfer established skills to new technologies and domains.

[57] Mark S. Ackerman, "The Intellectual Challenge of CSCW: The Gap Between Social Requirements and Technical Feasibility," Human-Computer Interaction 15:2-3 (2000), 181-203.

[58] Nicola Green, Hazel Lachoee, and Nina Wakeford, "Rethinking Queer Communications: Mobile Phones and Beyond." Sexualities, Medias and Technologies Conference (University of Surrey, Guildford, UK, June 21-22, 2001).

[59] danah boyd, "Friendster and Publicly Articulated Social Networks." Conference on Human Factors in Computing Systems (Vienna, Austria, 2004).

21.3.2.8 Evidence: Falling into the pitfall

Some researchers envision context-aware mobile phones that disclose the user's activity to the caller to help explain why his call was not answered.[60] But this prohibits users from exploiting plausible deniability. There can be value in keeping the caller ignorant of the reason for not answering.

[60] D. Siewiorek, A. Smailagic, J. Furukawa, A. Krause, N. Moraveji, K. Reiger, J. Shaffer, and F. Wong, "SenSay: A Context-Aware Mobile Phone," IEEE International Symposium on Wearable Computers (White Plains, NY, 2003).

Location-tracking systems like those described in Harper et al.[61] and Beckwith[62] constrain the user's ability to incorporate ambiguity into location disclosures. Users can convey only their precise location or, when permitted, nothing at all.

[61] Harper et al.

[62] Beckwith.

Returning to the privacy controversy surrounding Google's email system, one possible reason for people's discomfort with Gmail's content-triggered advertising is its inconsistency with the long-established expectation that the content of one's mail is for the eyes of the sender and the recipient only. With respect to this pitfall, the fact that Gmail discloses no private information to advertisers, third parties, or Google employees is not the issue. The issue is the plain expectation that mail service providers (electronic or physical) will interpret a correspondence's metadata (electronic headers or physical envelopes) but never its contents. Many people would express discomfort if the U.S. Postal Service employed robots to open people's mail, scan the contents, reseal the envelopes, and send content-related junk mail to the recipient. Even if no private information ever left each robot, people would react to the violation of an established social expectation, namely the inviolability, under normal conditions, of decidedly private communications.

21.3.2.9 Evidence: Avoiding the pitfall

Mobile phones, push-to-talk phones,[63] and instant messaging systems[64] let users exploit plausible deniability by not responding to hails and not having to explain why.

[63] Woodruff and Aoki.

[64] Nardi, Whittaker, and Bradner.

Although privacy on the Web is a common concern, a basic function of HTML allows users to practice ambiguous disclosure: forms that let users enter false data facilitate anonymous account creation and service provision.

Tribe.net supports another established practice. It allows users to cooperatively partition their social networks into tribes, thereby letting both preexisting and new groups represent themselves online, situated within the greater networks to which they are connected. In contrast, Friendster.com users each have a single set of friends that cannot be functionally partitioned.


