2.4. Panorama: Understanding the Importance of the Environment

The environment surrounding the process of developing security is also extremely important to the effective operation of the resulting security mechanism. During the design of the security, a number of factors not necessarily related to any security needs can influence the final product. The personal responsibility of participants for the resulting security, the enthusiasm of high-level management for security (even their presence in the design process), pressure to achieve functional features, time-to-market, personal agendas, ethical constraints, industrial involvement, and legal requirements all influence security design in one way or another.[23]

[23] Flechais, Sasse, and Hailes.

The cultural panorama surrounding security does not stop affecting it after the design is complete, but continues even after it has been put to use. An analysis by Flechais, Riegelsberger, and Sasse[24] has identified the influences and mechanics of trust relationships in the operation of secure systems. In most current cases, existing trust relationships in an organization facilitate the breaking of security policies and practices. In fact, given the right (and currently widespread) environment, simply adhering to existing security policies can undermine social relationships within a group of peers. The authors argue that trust relationships are beneficial for organizations by promoting social capital[25] (i.e., trust based on shared informal norms that promote cooperation),[26] and that the organizational culture and the actual security should be designed to support both trust relationships and adherence to policy.

[24] Ivan Flechais, Jens Riegelsberger, and Angela M. Sasse, "Divide and Conquer: The Role of Trust and Assurance in the Design of Socio-Technical Systems," Technical Report, 2004.

[25] Mitnick and Simon.

[26] Brostoff and Sasse, 2001.

2.4.1. The Role of Education, Training, Motivation, and Persuasion

While a well-designed security mechanism won't put off users, it also won't entice them to make the extra effort that security requires. In many home and organizational contexts, users lack the motivation to make that extra effort. User education and training can be used to explain the need for the effort that security requires, but changing users' knowledge and understanding does not automatically mean they will change their behavior. Dhamija and Perrig,[27] for instance, found that a sample of users with weak passwords had "received relevant training" and did know how to construct strong passwords; however, they chose not to comply with the request to construct strong passwords. The first point to make here is that there is a difference between education and training: while education is largely about teaching concepts and skills, training aims to change behavior through drill, monitoring, feedback, reinforcement, and, in the case of willful noncompliance, punishment. Because social engineering attacks often bypass technology altogether to obtain access or information, Mitnick and Simon[28] emphasize that effective security education and training should:

[27] Dhamija and Perrig.

[28] Mitnick and Simon.

  • Not only focus on correct usage of security mechanisms, but also address other behaviors, for example, checking that callers are who they claim to be

  • Encompass all staff, not only those with immediate access to systems deemed at risk
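The Dhamija and Perrig finding cited above concerned users who had been trained in what makes a password strong yet chose weak ones anyway. As an illustration only, the kind of criteria such training typically conveys (length plus variety of character classes) can be sketched as a simple strength check; the function name and thresholds below are assumptions for demonstration, not taken from the chapter or from any standard:

```python
import re

def password_strength(password: str) -> str:
    """Classify a password as 'weak', 'medium', or 'strong' using
    illustrative criteria: length and character-class variety.
    The thresholds are demonstration assumptions, not a standard."""
    classes = sum([
        bool(re.search(r"[a-z]", password)),         # lowercase letters
        bool(re.search(r"[A-Z]", password)),         # uppercase letters
        bool(re.search(r"[0-9]", password)),         # digits
        bool(re.search(r"[^a-zA-Z0-9]", password)),  # symbols
    ])
    if len(password) >= 12 and classes >= 3:
        return "strong"
    if len(password) >= 8 and classes >= 2:
        return "medium"
    return "weak"

print(password_strength("sunshine"))         # weak: one character class
print(password_strength("Sunshine42"))       # medium: three classes, short
print(password_strength("S!unshine42-xyz"))  # strong: long, four classes
```

The point of the sketch is that the knowledge encoded here is easy to teach; as the study shows, teaching it does not by itself produce compliance.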

Many organizations simply provide security instructions to users and expect these instructions to be followed. The material disseminated may even threaten punishment. However, the threat of punishment alone won't change users' behavior; rather, if users see that rules are not enforced, they will lose respect for the security in general, and the result is a declining security culture. Even though some security experts advocate rigorous punishment as a way of weeding out undesirable user behavior, this is not an easy option. Policing undesirable behavior (detection and punishment) requires considerable resources, can be detrimental to the organizational climate, and may have undesirable side effects (such as increasing staff turnover). Given that sanctions have an effect only if they are applied, and given that there may be undesirable side effects, an organization would be well advised to specify sanctions only for a small set of key behaviors that it deems to be putting key assets at risk.

Weirich and Sasse[29] identified a set of beliefs and attitudes held by many users who do not comply with security policies:

[29] Weirich and Sasse, 2001.

  • Users do not believe they are personally at risk.

  • Users do not believe they will be held accountable for not following security regulations.

  • The behavior required by security mechanisms conflicts with social norms.

  • The behavior required by security mechanisms conflicts with users' self-image. (The perception is that only "nerds" and "paranoid" people follow security regulations.)

There can be no doubt that security in general, and IT security in particular, currently suffers from an image problem. Education campaigns (similar to those employed in health education) can be effective only if they make users believe that something they care about is at risk. In the most recent CSI/FBI survey,[30] the overwhelming majority of respondents stated that their company needed to invest in raising security awareness. When users cannot be motivated, persuasion needs to be employed. In this chapter, we present some examples of persuasion designed to improve security behavior in the corporate context; Fogg[31] offers techniques for designing applications and interfaces that intrigue, persuade, and reward users to achieve desired user behavior in general.

[30] Ninth Annual CSI/FBI Survey on Computer Crime and Security (2004); http://www.gocsi.com/.

[31] B. J. Fogg, Persuasive Technology: Using Computers to Change What We Think and Do (San Francisco: Morgan Kaufmann, 2003).

2.4.2. Building a Security Culture

Earlier in this chapter, we emphasized the importance of having organizations integrate security into their business processes, and argued that the best motivation for users to exhibit desired security behavior is if they care about what is being protected, and understand how their behavior can put these assets at risk. These two arguments provide the foundation for the next key point: organizations must become actively involved in security design. They need to build a security culture as much as they need to build a system of technical countermeasures. Although some organizations understand that risk analysis is the bedrock of security design, many still do not understand the role of security in their business/production processes. Too many organizations are still copying standard security policies, deploying standard security mechanisms, and leaving decisions about security largely to security experts. Security decisions are then often made in an ad hoc fashion, as a "firefighting" response to the latest threat.

Organizations need to become actively involved in making decisions about what should be protected, and how. This requires performing a risk and threat analysis, and making decisions based on what makes economic sense for the business, instead of trying to meet abstract standards set by security experts. Although many companies already use risk analysis methods, they often fail to consider the interests and needs of all stakeholders, such as users, and the economics of security are currently not well understood.
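One widely used way to ground such decisions in business economics is the annualized loss expectancy (ALE) calculation from classical quantitative risk analysis: the expected annual cost of a threat is the single loss expectancy (asset value times the fraction lost per incident) multiplied by the expected incidents per year. The sketch below uses hypothetical figures; the function name and numbers are illustrative, not drawn from the chapter:

```python
def annualized_loss_expectancy(asset_value: float,
                               exposure_factor: float,
                               annual_rate_of_occurrence: float) -> float:
    """ALE = SLE * ARO, where SLE (single loss expectancy) is the
    asset value times the fraction of it lost in one incident."""
    single_loss_expectancy = asset_value * exposure_factor
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical numbers: a $200,000 customer database, 30% of its value
# lost per breach, one breach expected every two years (ARO = 0.5).
ale = annualized_loss_expectancy(200_000, 0.30, 0.5)
print(ale)  # 30000.0
```

On these assumed figures, a countermeasure costing more than $30,000 per year to prevent this one threat would not make economic sense for this asset alone, which is the kind of business-grounded judgment the text advocates over meeting abstract expert-set standards.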

Once specific security goals appropriate to the organization have been established, role models are essential to change behavior and rebuild the security culture. This will require buy-in from the top. Currently, senior managers sometimes exhibit bad security behavior because they believe that they are too important to bother with "petty" security policies. The security experts to whom they have delegated responsibility for the organization's security are often unable to force senior managers, who have the power to fire them, to comply with security policies. We would argue that the ultimate responsibility for security, as for safety, should always lie with senior management. Security experts can advise, implement, and monitor, but they cannot take sole responsibility for making an organization's security work.

An additional approach worth considering is to make secure behavior a desirable trait. This can be done through social marketing or by making such behavior part of professional and ethical norms.[32] Organizations that deal with confidential customer data, for instance, must make clear to all of their staff that they have a duty to safeguard such data from unauthorized access or tampering.

[32] Sasse, Brostoff, and Weirich.

Security and Usability: Designing Secure Systems That People Can Use
ISBN: 0596008279
Year: 2004