31.2. Groove Virtual Office Design

In the following sections, we look at the key design issues for Groove Virtual Office.

31.2.1. The Weakest Link

In Secrets and Lies, Bruce Schneier concludes that correct cryptographic algorithms are necessary but not sufficient for creating secure systems.[1] Indeed, in complex systems, the cryptographic algorithms are the last place that anyone attacks. Why exploit a weakness in the random number generator when users will launch any email attachment that promises them a visual thrill? Why set up an elaborate man-in-the-middle attack when users are likely to write their passwords down on sticky notes? A system is only as secure as its weakest link, and in many systems, the weakest link is often the user.

[1] Bruce Schneier, Secrets and Lies (Indianapolis: Wiley Publishing, Inc., 2000).

Of course, security professionals have been aware of the limitations of the user ever since Troy accepted a free horse. There are two common user interface techniques for strengthening the user link:

Figure 31-1. Groove Workspace was used in the Iraq War to assess the humanitarian assistance needs of people affected by the fighting; Groove was chosen for its ability to work in austere networking infrastructures


  • Prevent the user from doing the wrong thing: for example, enforce strong password policies

  • Teach the user how to do the right thing: for example, teach people how to choose a strong password

Both techniques have their place, but unfortunately, both also have limitations. Jakob Nielsen writes that as passwords become stronger (and thus more difficult to remember), users tend to write their passwords down on sticky notes.[2] Thus, Nielsen argues, enforcing strong passwords weakens security as a direct consequence of being less usable. We would argue, however, that the effect on security is subtler. If strong passwords are enforced, remote attacks (i.e., attacks over the Internet) become harder but insider attacks become easier (because passwords will be on sticky notes).

[2] Jakob Nielsen, "Alert Box" (Nov. 26, 2000); http://www.useit.com/alertbox/20001126.html.

Teaching the user to do the right thing also has tradeoffs. For most users, computers are tools for getting their job done, and most users are not motivated to learn about security procedures. When we first designed Groove, we decided to encourage people to use strong passwords by prompting for a passphrase in our login and account creation screens. We felt that this would be a subtle reminder that people could, and should, use a phrase rather than a word for their login credentials. Unfortunately, this minor deviation from user expectations caused more confusion than it was worth. In usability testing, we found that users struggled in the account creation screens: several were confused by the word passphrase and were not sure what to enter. Moreover, the unfamiliar term reinforced the perception that Groove Virtual Office was nonstandard and thus more complicated than the software people were used to, which could lead users to send sensitive information via email rather than Groove. Because email is often sent in the clear, that choice left users worse off from a security perspective. The performance of workers is rarely measured by the security precautions they take, so they are likely to choose the medium that lets them accomplish their task (sending a file) quickly and comfortably. Thus, given a choice between an easy solution that is insecure and a difficult solution that is secure, the user is likely to choose the former. Groove Virtual Office had to be both easy to use and secure.

31.2.2. Do the Right Thing

From experiences like these, we concluded that we needed a flexible security model. This was all the more important given the different security needs of our diverse target audience, ranging from office workers who mainly use email and Microsoft Word to military and intelligence personnel. Each user group requires a different level of security but also has a different tolerance for security inconveniences. The security model of Groove Virtual Office accommodates all of these groups. Depending on the needs and abilities of the user, and the properties of the networking environment, Groove applies one or both of the two user interface (UI) techniques (i.e., enforcing and teaching) to maximize the security of the system.

The main user groups for Groove are:

  • Office workers. Most office workers want to think about their work, not their infrastructure. They tend to avoid products that force them to learn new concepts, and they want security to be as invisible as possible. For these users, Groove relies on a centralized server to enforce security policies and to serve as a certification authority. As we describe later in this chapter, a central server allows Groove to provide secure communications without any user intervention or effort.

  • Road warriors. Another type of user that Groove Virtual Office needs to satisfy is the road warrior. People who travel for business may have to deal with multiple central servers (their company's server and a client's server, for example), or may have to make do without a central server at all. Groove uses various UI techniques (such as colors and prompts) to help these users understand how to work securely. For example, Groove can warn users if they are communicating with someone who has not been certified by their own central server.

  • Military and intelligence users. Finally, Groove needs to support users in the military and intelligence community. Groove's architecture works particularly well in austere networking environments such as ad hoc field networks with only occasional Internet connectivity. These users need a high level of security but cannot rely on a centralized server to enforce policies or manage identities and access control. Moreover, these users often communicate with people across organizational and, sometimes, warfront boundaries. Groove allows these users to communicate securely, even without a central server, by teaching them to authenticate each other directly. Direct authentication (described later in this chapter) guarantees secure communications over insecure infrastructure without requiring a trusted third party.

We knew that forcing the office worker to use the intelligence users' level of security would lead to user frustration and, eventually, abandonment of our product. The challenge, then, was to find a way to serve all of these user types with all of their various environments and expectations. This led us to our primary guiding principle: a flexible approach to security that keeps the users and their environments in mind will, in the end, be significantly stronger than one that merely mandates security upon the user.

Secure communications in Iraq are a good example. Naturally, the U.S. military has secure, closed networks such as SIPRNET that were designed for the transmission of highly sensitive data. Nevertheless, because the mission in Iraq involves reconstruction projects as well as military operations, soldiers often need to communicate with humanitarian organizations, private contractors, and local Iraqi civilians. Most of these people do not have access to SIPRNET. Instead, some battalions in Iraq now use Groove to communicate both with their rear command and with humanitarian and civilian organizations. Because Groove does not mandate a central certification authority, it can be deployed without an administrator. In addition, because Groove can send encrypted communications over insecure networks, it allows Iraqi civilians to communicate with the U.S. military securely.

The security architecture of Groove Virtual Office takes advantage of the environment that it finds itself in to maximize the security for the user, given the constraints of the system. In other words, Groove scales the security offered to the user based on the desires and abilities of the user and the existing infrastructure. This approach allowed us to satisfy Ozzie's core principles.

31.2.3. Is That You, Alice?

An example of this flexible approach is the design of the identity and authentication system in Groove Virtual Office. All communications and collaboration systems need to convey the identity of users to each other, and questions inevitably arise:

  • Is that really a message from Alice?

  • How do I know that the Alice Smith in the directory is really the Alice Smith that I know (and not an impostor)?

  • How do I know that Alice Smith is the Alice Smith that works in my company and not a similarly named Alice Smith who works for a competitor?

There are well-known algorithms for securely answering all these questions. Most systems use public-private key cryptography to prove that the owner of a given public key composed a message. Next, most systems use a centralized certification authority to vouch that the owner of a given public key is named, for example, "Alice Smith." Finally, most systems have a centralized, hierarchical directory that disambiguates among different people with the same name.

Groove, unfortunately, cannot always rely on a central server for certifying and disambiguating identities. Relying on a central server makes sense in many cases, but Groove Virtual Office must be able to function without one. For example, negotiators in the talks between the Sri Lankan government and Tamil Tiger rebels used Groove Virtual Office. Neither side wanted the other to run a certifying server. Even cross-certification was unacceptable because of intense political sensitivities.[3] Groove's flexible approach pays off in this situation because it supports direct authentication. With direct authentication, two parties who want to communicate securely can authenticate each other without having to trust a third party or even each other. In the case of the Sri Lanka peace talks, this allowed the two sides to communicate in a neutral space that no one controlled. In the corporate world, direct authentication allows you to securely communicate with an external party without having to wait for your IT department to issue a certificate.

[3] "To Engender and Sustain a Holistic and Integrated Peacebuilding Process in Sri Lanka"; http://www.info-share.org/.

31.2.4. Colorful Security

Groove uses public key cryptography to provide message-level security, without requiring user intervention, special knowledge, or centralized administration. For example, when Bob receives a message from Alice, Groove displays the message as originating from a person named "Alice." The message is encrypted with Bob's public key, so no one but Bob can read it. The message is also signed with Alice's private key, so only the person possessing that private key could have written it.
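The flow just described, encrypting with the recipient's public key and signing with the sender's private key, can be sketched with textbook RSA. This is a toy illustration, not Groove's implementation: the primes are tiny and there is no padding, so it is wildly insecure, but the asymmetry it demonstrates is the same.

```python
import hashlib

# Textbook RSA with tiny primes: a toy to show the flow only.
# Real systems use 2048-bit (or larger) keys with proper padding.
def make_toy_key(p, q, e=17):
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)            # modular inverse (Python 3.8+)
    return (n, e), (n, d)          # (public key, private key)

alice_pub, alice_priv = make_toy_key(61, 53)
bob_pub, bob_priv = make_toy_key(67, 71)

def sign(message: bytes, priv):
    n, d = priv
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)            # only the holder of the private key can do this

def verify(message: bytes, sig, pub):
    n, e = pub
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == h     # anyone with the public key can check

def encrypt(m: int, pub):          # m must be smaller than n in this toy
    n, e = pub
    return pow(m, e, n)            # only the matching private key can reverse this

def decrypt(c: int, priv):
    n, d = priv
    return pow(c, d, n)

message = b"status report"
sig = sign(message, alice_priv)            # signed with Alice's private key
ciphertext = encrypt(42, bob_pub)          # encrypted with Bob's public key

assert decrypt(ciphertext, bob_priv) == 42 # only Bob can read it
assert verify(message, sig, alice_pub)     # only Alice could have signed it
```

The point of the sketch is the division of labor: Bob's public key protects confidentiality, and Alice's private key provides the proof of authorship that the rest of this section builds on.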

A problem remains. Although the underlying Groove cryptographic layer protects the message cryptographically, it cannot guarantee that the message was written by the Alice that Bob expects. The Groove user interface must help Bob to determine whether Alice is an impostor. When Bob sees that he has a message from Alice, the name "Alice" will be displayed in a color that represents the authentication level for Alice. The authentication color answers the question: "Which Alice is this?" If Bob and Alice are both employees of the same company, their management server will vouch for the identity of Alice and Bob.[4] In that case, Alice's name will display in teal on Bob's screen (and vice versa).

[4] Cryptographically speaking, Alice's Groove identity certificate is signed with the management server's Certificate Authority (CA) private key. Because Bob knows the server's CA public key (it's his server too) he can trust that Alice's identity is valid, at least as far as the management server is concerned.

But what if Bob and Alice don't work for the same company? The companies can choose to cross-certify each other's employees. The Groove UI automatically helps Bob recognize Alice by displaying Alice's name in blue, a color designated for certified contacts outside the company. We chose to explicitly distinguish between certified contacts within the company and those outside it so that users can easily see the company affiliations of their contacts before exchanging company-confidential information.

In some cases, however, the additional administrative overhead of cross-certification is not desired. This is where more user training is required. The most secure approach is for Bob to use an out-of-band channel (such as a phone conversation) to verify that the public key he sees for Alice matches the key Alice knows to be hers. To make this easier, the Groove user interface displays a fingerprint[5] for each person shown in the UI. Once the fingerprint is verified, Bob can directly authenticate Alice. Thereafter, all messages and communications from Alice will show Alice's name in green. This method of authentication is similar to PGP's web of trust model.

[5] The fingerprint is a cryptographic hash of the public key and thus is easier to compare manually.
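A fingerprint of this kind can be sketched in a few lines. Here we hash the public-key bytes with SHA-256 and group the hex digits so two people can compare them aloud over the phone; the hash choice and grouping are our illustration, not Groove's actual format.

```python
import hashlib

def fingerprint(public_key_bytes: bytes, group: int = 4) -> str:
    """Render a public key as a short, human-comparable string."""
    digest = hashlib.sha256(public_key_bytes).hexdigest()
    # Group the hex digits so they are easy to read over the phone.
    return " ".join(digest[i:i + group] for i in range(0, len(digest), group))

# Alice reads her displayed fingerprint aloud; Bob compares it against
# the fingerprint his client computes for the key it received.
key = b"...example public key bytes..."
print(fingerprint(key))
```

Because the fingerprint is a collision-resistant hash of the key itself, matching fingerprints mean matching keys, which is exactly the fact the out-of-band phone call needs to establish.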

Groove will also warn the user if there is already a different user with the name Alice. That is, if two people with different public keys both claim to be Alice, both Alices are shown in red and Bob is asked to disambiguate the two. The Groove UI shows the user the various contexts in which the two Alices have communicated with Bob. For example, the user interface might indicate that the first Alice is in a marketing workspace with Bob and the second Alice is in a family photo-sharing workspace. Bob can use this information to disambiguate the two Alices, and then alias[6] one or both of the Alices so that he can keep them straight in the Groove user interface. This technique protects Bob from spoofing attacks after he has communicated with the real Alice. Any subsequent people claiming to be Alice will appear in red in the UI, and Bob can determine whether the second Alice is an impostor. Figure 31-2 shows the Resolve Name Conflict screen.

[6] An alias is a user-local name given to a contact; Groove does not share your aliases with others.
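The conflict detection described above can be sketched as a contact store keyed by public key rather than by display name; the class and method names here are ours, not Groove's.

```python
# Contacts are keyed by public key, never by display name, so two "Alice"s
# with different keys are both kept -- and both flagged for the user.
class ContactStore:
    def __init__(self):
        self.by_key = {}      # public key -> display name

    def add(self, public_key: bytes, name: str):
        """Store the contact; return keys of same-named contacts, if any."""
        conflicts = [k for k, n in self.by_key.items()
                     if n == name and k != public_key]
        self.by_key[public_key] = name
        return conflicts      # non-empty -> show all in red, ask user to resolve

store = ContactStore()
assert store.add(b"key-of-real-alice", "Alice") == []   # first Alice: no warning
assert store.add(b"key-of-impostor", "Alice") == [b"key-of-real-alice"]
```

Keying on the public key is the design decision that makes this work without a central naming authority: the name is merely a label, while the key is the identity.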

The technique of displaying authentication and certification information with a name is what makes it possible for Groove to support different user groups. Office workers in an enterprise environment do not have to worry about security because a management server authenticates their contacts; an office worker will generally see all names as teal (certified) without taking additional steps. Road warriors, on the other hand, can use the colors to distinguish between people inside their own company (who show as teal), people who are trusted but are outside their company (who show as blue), and people who are unauthenticated (who show as black). In addition, all users benefit from notifications about duplicate names (which show as red). Finally, users in the intelligence community can be trained to use direct authentication without any server infrastructure (out-of-band exchange of fingerprints) to authenticate users (who show as green). These color codes are shown in Figure 31-3.
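The color scheme boils down to a small mapping from authentication state to display color. A minimal sketch, with enum names of our own choosing rather than Groove's internal API:

```python
from enum import Enum

# The five authentication levels and their display colors, as described above.
class AuthLevel(Enum):
    DIRECTLY_AUTHENTICATED = "green"   # fingerprint verified out of band
    CERTIFIED_IN_COMPANY = "teal"      # vouched for by the user's own server
    CERTIFIED_CROSS_ORG = "blue"       # certified, but outside the company
    UNAUTHENTICATED = "black"          # no certification at all
    NAME_CONFLICT = "red"              # same name, different public key

def display_color(level: AuthLevel) -> str:
    return level.value

assert display_color(AuthLevel.CERTIFIED_IN_COMPANY) == "teal"
```

Note that the mapping is total: every contact always has exactly one level, so every displayed name carries its trust information implicitly, with no extra clicks required of the user.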

Figure 31-2. The Resolve Name Conflict dialog box allows the user to disambiguate two people with the same name but different public keys; being able to handle different people with the same name is crucial for software that requires no central naming authority; this is also the first line of defense against spoofing attacks


Are five distinct authentication and certification colors too many? Will they overwhelm users and add unneeded complexity? Will users find them distracting or useful? Will users understand the significance of these colors? These are questions we asked when testing and adapting the usability of our "flexible" UI security model. What we have found is that the system works well because it evolves with the user's environment and use of Groove Virtual Office. For example, office workers see and need to learn only the three of the five colors that are pertinent to their role:

  • Teal (for certified users within their company)

  • Black (warning for uncertified users)

  • Red (high warning for name conflict and possible attack)

Figure 31-3. Groove keeps track of how each person is authenticated; some people (shown in green) are directly authenticated, which means that the user has validated the digital fingerprint of the person's contact; other people (shown in teal) are certified by the organization's certificate authority; those not authenticated are shown in black, and those with name conflicts are shown in red


Two additional colors are displayed only for more advanced users:

  • Road warriors in multiple environments (blue for certified users outside of their company)

  • Intelligence analysts in austere circumstances (green for direct authentication)

Because color coding may at first appear new and foreign to users, we employ standard UI techniques such as tooltips and context-sensitive help to teach users these (as yet) nonstandard features. Of course, as we receive more user feedback, we will adapt our model to satisfy our diverse user community. For example, we may provide end-user and administrative tools that let users customize the colors to fit their local environment and preferences.

Most importantly, all three of these user communities can interact with each other using their own local policies and without compromising security. For example, an intelligence analyst does not have to trust an office worker's administrator to communicate with the office worker. Because the intelligence analyst can directly authenticate the office worker, the security of the authentication is in the control of the intelligence analyst and can be as secure as he wishes. In contrast, if the intelligence analyst had to rely on an administrator's certification, the administrator would be another (possibly vulnerable) link in the security chain.



Security and Usability: Designing Secure Systems That People Can Use
ISBN: 0596008279
Year: 2004
Pages: 295
