22.5. Beyond the Browser

Privacy Bird, in its current form, is useful mostly as a tool to raise user awareness about web site privacy practices. By integrating it with a cookie manager, as Microsoft and Netscape have done with their P3P user agents, we could enable more meaningful automated cookie management. If enough sites adopt P3P, and P3P user agents become widely used, the increased transparency about web site privacy practices may lead to the adoption of more privacy-friendly policies, whether as a result of market forces or of new regulations.[31] However, the real potential of an automated privacy policy framework may lie in applications that go beyond the web browser.

[31] Lorrie Cranor and Rigo Wenning, "Why P3P Is a Good Tool for Consumers and Companies," Gigalaw.com (April 2002); http://www.gigalaw.com/articles/2002/cranor-2002-04.html.

ADVICE FOR PRIVACY SOFTWARE DEVELOPERS

  • Avoid the use of privacy jargon. Privacy terms used by experts are unfamiliar to most users. However, all of the P3P user agents we examined used jargon such as access, profiling, third-party cookie, and implicit consent.

  • Provide persistent privacy-related indicators. The Privacy Bird icon serves as an indicator of web site privacy practices that users could always find in their browser window. Users reported that they liked having the ability to get high-level privacy-related information at a glance.

  • Provide meaningful summaries of privacy-related information in standardized formats. Privacy-related information is complicated and can be time-consuming and difficult for users to read. Users appreciate short summaries, as long as these summaries do not hide critical information, and standardized formats allow users to find the information of most interest quickly. For P3P user agents, summaries should highlight data sharing and marketing practices, as well as opt-out information. Summary information may combine information about multiple aspects of privacy, or reduce the granularity of information, if this helps users understand what is being conveyed to them or lets them make configuration decisions more easily.

  • Provide mechanisms for accessing detailed information. Users have differing privacy information needs. While most have narrow interests in privacy policy information, some want to see more detailed information. This information should be readily available to users who want it. It is especially important that information be available to explain why privacy warnings were raised or why protective actions, such as blocking a cookie, were taken.

  • Make configuration fast and easy. Users don't want to spend a lot of time configuring privacy tools, but they will be upset if the tools do not do what users expect, or if they interfere with users' normal activities. Reasonable defaults should be provided, along with a number of easily accessible alternative options.

  • Allow users to fine-tune their configuration. Users have nuanced privacy preferences, which they may want to articulate more explicitly over time. Tools should allow users to fine-tune their configurations as needed.

  • Convey meaningful information to users about the agent's capabilities and current state. Most users do not have a good understanding of privacy issues or of privacy software, and thus may have misconceptions about what the software actually does. It is important that users do not assume that their privacy software is providing protection that it is not capable of providing. Likewise, software that has capabilities that can be turned on or off should clearly indicate its state to users.

  • Provide educational opportunities to users over time as they use the tool. While users may be reluctant to read a lot of material up front, there seems to be an interest in learning more about privacy over time through use of a privacy user agent.
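Several of these points can be made concrete in a small sketch: a user agent that translates P3P vocabulary into plain language and orders a summary so that sharing and marketing practices appear first. The vocabulary tokens below (e.g. "telemarketing", "other-recipient") come from the P3P 1.0 specification, but the plain-language glosses are my own illustrative wording, not part of any standard:

```python
# Sketch: translating P3P jargon into plain-language summary lines.
# The glosses below are hypothetical; a real agent would need carefully
# user-tested wording for the full P3P vocabulary.

PLAIN_LANGUAGE = {
    "telemarketing": "may call you with marketing offers",
    "contact": "may contact you for marketing",
    "other-recipient": "may share your data with other companies",
    "public": "may publish your data publicly",
    "opt-out": "you can ask them to stop (opt out)",
}

# Show sharing and marketing practices first, per the advice above.
PRIORITY = ["other-recipient", "public", "telemarketing", "contact", "opt-out"]

def summarize(practices):
    """Return short plain-language lines for the practices a site declares."""
    lines = [PLAIN_LANGUAGE[p] for p in PRIORITY if p in practices]
    # Fall back to the raw token for anything we have no gloss for,
    # rather than silently hiding it from the user.
    lines += sorted(p for p in practices if p not in PLAIN_LANGUAGE)
    return lines

print(summarize({"telemarketing", "opt-out"}))
```

Falling back to the raw token is a deliberate choice: hiding an unrecognized practice would violate the rule that summaries must not conceal critical information.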


Today it is quite difficult for individuals to take privacy into consideration while comparison shopping. Web sites exist that compare similar products based on user reviews. Other sites compare price and shipping charges across vendors that offer the same product. But consumers who wish to purchase a product from the site that has the best privacy practices must tediously compare lengthy human-readable privacy policies across many sites. If all of the sites under consideration were P3P enabled, a consumer could visit these sites with a P3P user agent and determine which ones best meet their privacy preferences. However, the comparison process would be eased by a tool that could perform the comparison directly. I can imagine the addition of a privacy comparison feature to any of the price comparison services currently available. To facilitate privacy comparisons more generally, it would be useful to have such a feature built into a general-purpose search engine.
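A privacy comparison feature of the kind just described could be as simple as filtering vendors whose declared practices conflict with the user's preferences, then comparing on price. In this sketch, the site data and the "disliked practices" model are hypothetical stand-ins for what a real tool would derive from each site's P3P policy and the user's configured preferences:

```python
# Sketch: comparison shopping that takes privacy into account.
# Site records and practice labels are hypothetical illustrations.

sites = [
    {"name": "shopA", "price": 19.99, "practices": {"telemarketing", "sharing"}},
    {"name": "shopB", "price": 21.50, "practices": set()},
    {"name": "shopC", "price": 20.75, "practices": {"sharing"}},
]

def acceptable(site, disliked):
    """A site is acceptable if it declares none of the disliked practices."""
    return not (site["practices"] & disliked)

def best_offer(sites, disliked):
    """Cheapest site among those whose declared practices the user accepts."""
    ok = [s for s in sites if acceptable(s, disliked)]
    return min(ok, key=lambda s: s["price"]) if ok else None

print(best_offer(sites, {"telemarketing", "sharing"})["name"])  # shopB
```

A user who rejects telemarketing and data sharing ends up paying slightly more at shopB; surfacing that trade-off explicitly is exactly what today's price comparison services omit.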

Simon Byers, David Kormann, Patrick McDaniel, and I have developed a prototype P3P-enabled search engine using the Google API.[32] Our prototype returns a Google-style search result page with Privacy Bird icons annotating each result to indicate whether it matches the privacy preference level configured by the user. We have demonstrated the feasibility of such a service and have experimented with ways to reduce the associated performance overhead. Further work is needed on the best approach to configuring preferences and displaying results. For example, we would like to investigate how to reorder search results so that sites with better privacy policies appear toward the top while ensuring that top search results are good matches to users' queries.

[32] Simon Byers, Lorrie Cranor, Dave Kormann, and Patrick McDaniel, "Searching for Privacy: Design and Implementation of a P3P-Enabled Search Engine," Proceedings of the 2004 Workshop on Privacy Enhancing Technologies (PET 2004; Toronto, May 26-28, 2004).
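One way to frame the reordering question raised above is as a weighted blend of a result's relevance score and its privacy-preference match, so that privacy-friendly sites rise without displacing clearly better answers to the query. The weighting formula here is hypothetical; our prototype did not settle on a particular scheme:

```python
# Sketch: re-ranking search results by blending relevance with a
# privacy-preference match score. The 0.3 weight is an assumed value
# for illustration, not an experimentally validated choice.

def rerank(results, privacy_weight=0.3):
    """results: list of (url, relevance, privacy_match), both scores in [0, 1].
    Returns the list sorted by the blended score, highest first."""
    def score(r):
        url, relevance, privacy = r
        return (1 - privacy_weight) * relevance + privacy_weight * privacy
    return sorted(results, key=score, reverse=True)

results = [
    ("https://a.example", 0.95, 0.0),  # very relevant, poor privacy match
    ("https://b.example", 0.90, 1.0),  # nearly as relevant, full match
]
print([url for url, _, _ in rerank(results)])
```

With the weight set to zero the ordering reduces to plain relevance ranking, which makes the parameter a natural knob for the configuration studies the text calls for.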

To realize the vision I introduced at the beginning of this chapter, in which computer-readable privacy policies are associated with all automated data collection, we need tools that can detect the presence of data collection devices, read their privacy policies, and take appropriate actions. In some cases, these tools might be able to signal back to the data collection device that the user does not want his data collected, and the device might respond by turning off data collection until the user is no longer in proximity. Devices might also be able to take steps to anonymize data upon request, for example by substituting an image of an "anonymous face" for a person's face in a video recording.[33] When data collection cannot be suppressed, privacy tools might alert their users and suggest routes that will avoid these devices.

[33] E. Newton, L. Sweeney, and B. Malin, "Preserving Privacy by De-Identifying Facial Images," IEEE Transactions on Knowledge and Data Engineering 17:2 (2005), 232-243.
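The negotiation described above can be simulated in a few lines: an agent discovers a device, checks its declared practices against the user's preferences, asks it to suppress collection, and falls back to warning the user if the device refuses. Everything here is hypothetical; no such device protocol is standardized:

```python
# Sketch of agent/device negotiation for suppressing unwanted data
# collection. The Device class and its opt-out behavior are invented
# for illustration only.

class Device:
    def __init__(self, name, practices, honors_opt_out):
        self.name = name
        self.practices = practices          # e.g. {"video"}
        self.honors_opt_out = honors_opt_out
        self.collecting = True

    def request_suppression(self):
        """The device may or may not honor an opt-out signal."""
        if self.honors_opt_out:
            self.collecting = False
        return not self.collecting

def handle(device, disliked):
    """Decide what the user's agent should do about a nearby device."""
    if not (device.practices & disliked):
        return "allow"
    if device.request_suppression():
        return "suppressed"
    return "warn: suggest a route avoiding this device"

cam = Device("lobby-camera", {"video"}, honors_opt_out=True)
print(handle(cam, {"video"}))  # suppressed
```

The third branch corresponds to the fallback in the text: when suppression fails, the agent can only alert the user and suggest avoidance.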

Besides helping users avoid data collection, privacy tools may also facilitate controlled sharing of data. For example, in ubiquitous computing environments, users may wish to advertise their location or presence to friends or co-workers, or to devices that might perform useful services, while preventing other people and devices from gaining access to this information. In this case, privacy rules might take into account not only privacy policies, but also information about the user's relationship with other individuals and the types of services offered by devices. Semantic knowledge captured using semantic web tools could facilitate the creation of and reasoning about such rules.[34]

[34] F. Gandon and N. Sadeh, "Semantic Web Technologies to Reconcile Privacy and Context Awareness," Web Semantics Journal 1:3, 2004.
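A minimal version of such relationship-aware rules is a mapping from the requester's role to the attributes that role may see. The relationship data and rule set below are hypothetical stand-ins for the richer, machine-reasoned knowledge that semantic web tools might supply:

```python
# Sketch: location/presence sharing rules keyed on the requester's
# relationship to the user. Names, roles, and rules are illustrative.

relationships = {"alice": "coworker", "bob": "friend"}

rules = {
    "friend": {"location", "presence"},
    "coworker": {"presence"},
    "stranger": set(),          # default: share nothing
}

def may_see(requester, attribute):
    """Unknown requesters are treated as strangers."""
    role = relationships.get(requester, "stranger")
    return attribute in rules[role]

print(may_see("alice", "presence"), may_see("mallory", "location"))
```

Defaulting unknown parties to the most restrictive role is the conservative choice the text implies: services and people gain access only through an established relationship.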

There is clearly a lot of work to be done before it will be prudent to entrust to an automated agent the many nuanced privacy-related decisions we make on a daily basis.[35] Work on the problems of capturing privacy preferences and displaying privacy-related information is bringing us closer to this vision.

[35] See Chapter 5, this volume.



Security and Usability: Designing Secure Systems That People Can Use
ISBN: 0596008279
Year: 2004