22.5. Beyond the Browser
Privacy Bird, in its current form, is useful mostly as a tool to raise user awareness about web site privacy practices. By integrating it with a cookie manager, as Microsoft and Netscape have done with their P3P user agents, we could help facilitate more meaningful automated cookie management. If enough sites adopt P3P, and P3P user agents become widely used, the increased transparency about web site privacy practices may lead to the adoption of more privacy-friendly policies, as a result of either market forces or new regulations.
Avoid the use of privacy jargon. Privacy terms used by experts are unfamiliar to most users. However, all of the P3P user agents we examined used such jargon.
Provide persistent privacy indicators. The Privacy Bird icon serves as an indicator of web site privacy practices that users can always find in their browser window. Users appreciate having the ability to get high-level privacy-related information at a glance.
Provide meaningful summaries of privacy-related information in standardized formats. Privacy-related information is complicated and can be time consuming and difficult for users to read. Users appreciate short summaries, as long as these summaries do not hide critical information, and standardized formats allow them to find the information of most interest quickly. For P3P user agents, summaries should highlight data sharing and marketing practices, as well as other key privacy information. Summary information may combine information about multiple aspects of privacy, or reduce the granularity of information, if this will help users understand what is being conveyed to them or allow them to more easily make configuration decisions.
Provide mechanisms for accessing detailed information. It is important that information be available to explain why privacy warnings were raised or why protective actions, such as blocking a cookie, were taken.
Make configuration fast and easy. Users don't want to spend a lot of time configuring privacy tools, but they will be upset if the tools do not do what they expect, or if they interfere with their normal activities. Reasonable defaults should be provided along with a number of easily accessible alternative options.
Allow users to fine-tune their configuration. Users have nuanced privacy preferences, which they may want to express more explicitly over time. Tools should allow users to fine-tune their configurations as needed.
Provide meaningful information to users about the agent's capabilities and current state. Most users do not have a good understanding of privacy issues or of privacy software, and thus may have misconceptions about what the software actually does. It is important that users do not assume that their privacy software is providing protection it is not capable of providing. Likewise, software that has capabilities that can be turned on or off should clearly indicate its state to users.
Provide educational opportunities to users over time as they use the tool. While users may be reluctant to read a lot of material up front, there seems to be interest in learning more about privacy over time through use of a privacy user agent.
Today it is quite difficult for individuals to take privacy into consideration while comparison shopping. Web sites exist that compare similar products based on user reviews. Other sites compare price and shipping charges across vendors that offer the same product. But consumers who wish to purchase a product from the site that has the best privacy practices must tediously compare lengthy privacy policies across many sites. If all of the sites under consideration were P3P enabled, a consumer could visit these sites with a P3P user agent and determine which ones best meet their privacy preferences. However, the comparison process would be eased by a tool that could perform the comparison directly. I can imagine the addition of a privacy comparison feature to any of the price comparison services currently available. To facilitate privacy comparisons more broadly, it would be useful to have such a feature built into a general-purpose search engine.
Simon Byers, David Kormann, Patrick McDaniel, and I have developed a prototype P3P-enabled search engine using the Google API.
Our prototype returns a Google-style search result page with Privacy Bird icons annotating each result to indicate whether it matches the privacy preference level configured by the user. We have demonstrated the feasibility of such a service and have experimented with ways to reduce the associated performance overhead. Further work is needed on the best approach to configuring preferences and displaying results. For example, we would like to investigate how to reorder search results so that sites with better privacy policies appear toward the top while ensuring that top search results are good matches to users' queries.
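The core of such an annotation step can be sketched in a few lines. This is an illustrative sketch only, not the authors' prototype: it assumes each result's P3P policy has already been fetched (by convention, sites publish a policy reference file at /w3c/p3p.xml) and reduced to sets of P3P PURPOSE and RECIPIENT element names, and it models preference levels loosely on Privacy Bird's low/medium/high settings. All names and thresholds here are assumptions.

```python
# Hypothetical sketch of annotating search results with a Privacy
# Bird-style icon based on a P3P policy match. Policies are represented
# as dicts of sets of P3P element names; real policies are XML fetched
# from each site. Preference levels list the practices a user objects to.
PREFERENCE_LEVELS = {
    "low":    {"recipients": {"public"}},
    "medium": {"recipients": {"public", "unrelated"},
               "purposes":   {"telemarketing"}},
    "high":   {"recipients": {"public", "unrelated", "delivery"},
               "purposes":   {"telemarketing", "contact", "profiling"}},
}

def matches_preferences(policy, level):
    """True if the policy avoids every practice the user objects to;
    None if the site has no P3P policy (the 'uncertain' case)."""
    if policy is None:
        return None
    objections = PREFERENCE_LEVELS[level]
    if policy.get("purposes", set()) & objections.get("purposes", set()):
        return False
    if policy.get("recipients", set()) & objections.get("recipients", set()):
        return False
    return True

def annotate_results(results, level):
    """Attach an icon label to each (url, policy) search result."""
    icons = {True: "green bird", False: "red bird", None: "yellow bird"}
    return [(url, icons[matches_preferences(policy, level)])
            for url, policy in results]
```

For example, with the "medium" level, a site whose policy declares a telemarketing purpose would be flagged with a red bird, while a site with no P3P policy at all would get the uncertain (yellow) indicator.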
To realize the vision I introduced at the beginning of this chapter, in which computer-readable privacy policies are associated with all automated data collection, tools are needed that can detect the presence of data collection devices, read their privacy policies, and take appropriate actions. In some cases, these tools might be able to signal back to the data collection device that the user does not want his data collected, and the device might respond by turning off data collection until the user is no longer in proximity. Devices might also be able to take steps to anonymize data upon request, for example, substituting an image of an "anonymous face" on a video recording.
When data collection cannot be suppressed, privacy tools might alert their users and suggest routes that will avoid these devices.
Besides helping users avoid data collection, privacy tools may also facilitate controlled sharing of data. For example, in ubiquitous computing environments, users may wish to advertise their location or presence to selected individuals, or to devices that might perform useful services, while preventing other people and devices from gaining access to this information. In this case, privacy rules might take into account not only privacy policies, but also information about the user's relationship with other individuals and the types of services offered by devices. Semantic knowledge captured using semantic web tools could facilitate the creation of and reasoning about such rules.
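A rule of this kind might be sketched as follows. This is a hypothetical illustration, not a system described in the chapter: the rule structure, relationship categories, and service names are all assumptions, standing in for the richer representations that semantic web tools would provide.

```python
# Illustrative sketch of a location-disclosure rule that considers both
# the requester's relationship to the user and the service the
# requesting device offers. All names are hypothetical.

def allow_location(requester, rules):
    """Return True if any rule permits disclosing the user's location
    to the requester, described by its relationship and service."""
    for rule in rules:
        rel_ok = requester["relationship"] in rule["relationships"]
        # A rule with services=None does not restrict by service type.
        svc_ok = rule["services"] is None or requester["service"] in rule["services"]
        if rel_ok and svc_ok:
            return True
    return False

# Example policy: share location with friends and family
# unconditionally, and with unknown devices only when they offer a
# navigation service.
example_rules = [
    {"relationships": {"friend", "family"}, "services": None},
    {"relationships": {"unknown"}, "services": {"navigation"}},
]
```

Under this example policy, an unknown advertising kiosk would be refused, while an unknown navigation device would be granted access; a real system would of course need to verify the device's claimed identity and service.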
There is clearly a lot of work to be done before it will be prudent to delegate to an automated agent the many nuanced privacy-related decisions we make on a daily basis.
Work on the problems of capturing privacy preferences and displaying privacy-related information is bringing us closer to this vision.