26.3. Bootstrapping, Confidence, and Reputability

Another area where human factors are critical to privacy is in bootstrapping new systems. Because new systems start out with few users, they initially provide only small anonymity sets. This starting state creates a dilemma: a new system with improved privacy properties will attract users only once they believe it is popular and therefore offers large anonymity sets; but a system cannot become popular until it attracts those users. New systems need users for privacy, but need privacy for users.

Low-needs users can break the deadlock. The earliest stages of an anonymizing network's lifetime tend to involve users who need only to resist weak attackers who can't know which users are using the network and thus can't learn the contents of the small anonymity set. This solution reverses the early-adopter trends of many security systems: instead of first attracting the most security-conscious users, privacy applications must begin by first attracting low-needs users and hobbyists.

But this analysis relies on users' accurate perceptions of present and future anonymity set size. As in market economics, expectations themselves can bring about trends: a privacy system that people believe to be secure and popular will gain users, thus becoming (all other things being equal) more secure and popular. Thus, security depends not only on usability, but also on others' perception of that usability, and hence on the quality of the provider's marketing and public relations. Perversely, over-hyped systems (if they are not too broken) may be a better choice than modestly promoted ones, if the hype attracts more users.

Yet another factor in the safety of a given network is its reputability: the perception of its social value based on its current users. If I'm the only user of a system, it might be socially accepted, but I'm not getting any anonymity. Add a thousand Communists, and I'm anonymous, but everyone thinks I'm a Commie. Add a thousand random citizens (cancer survivors, privacy enthusiasts, and so on) and now I'm hard to profile.
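The arithmetic behind this example can be made concrete. The sketch below is not from the chapter; it is a toy model assuming a passive observer who sees only that someone uses the network and guesses that person's affiliation from the overall membership mix. The group names are illustrative.

```python
# Toy model (an illustrative assumption, not the chapter's method):
# an observer's best guess about any one user is simply each group's
# share of the total anonymity set. The larger and more varied the
# user base, the weaker any single guess becomes.

def profiling_confidence(membership):
    """Return the observer's guess probability for each group:
    group size divided by total anonymity set size."""
    total = sum(membership.values())
    return {group: count / total for group, count in membership.items()}

# One lone user: no anonymity at all.
print(profiling_confidence({"me": 1}))

# Add a thousand Communists: I'm anonymous, but presumed a Communist.
print(profiling_confidence({"me": 1, "communists": 1000}))

# Add a thousand assorted citizens: now I'm hard to profile.
print(profiling_confidence(
    {"me": 1, "communists": 1000, "assorted citizens": 1000}))
```

Running this shows the observer's confidence in the "Communist" label dropping from near-certainty to roughly a coin flip once a comparable number of other users joins, which is the sense in which diverse users make everyone harder to profile.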

The more cancer survivors on Tor, the better for the human rights activists. The more script kiddies, the worse for the normal users. Thus, reputability is an anonymity issue for two reasons. First, it impacts the sustainability of the network: a network that's always about to be shut down has difficulty attracting and keeping users, so its anonymity set suffers. Second, a disreputable network attracts the attention of powerful attackers who may not mind revealing the identities of all the users to uncover the few bad ones.

While people therefore have an incentive for the network to be used for "more reputable" activities than their own, there are still tradeoffs involved when it comes to anonymity. To follow the previous example, a network used entirely by cancer survivors might welcome some Communists onto the network, although of course they'd prefer a wider variety of users.

The impact of public perception on security is especially important during the bootstrapping phase of the network, in which the first few widely publicized uses of the network can dictate the types of users it attracts next.



Security and Usability: Designing Secure Systems That People Can Use
ISBN: 0596008279
Year: 2004
Pages: 295
