Nobody Will Ever Call You to Tell You How Well the Network Is Working

Between the two of us, we have spent about 20 years administering networks and systems. Throughout those 20 years, one thing became increasingly obvious: Nobody will ever call you to let you know how well the network is working. Never in 20 years of network administration did we get a phone call telling us that the e-mail system was working, that users could print without glitches, or that files were available without problems. The phone calls system administrators receive always seem to come at 0500 on Saturday morning, with the caller screaming about the network being down. That experience taught us two things:

  1. The people who called at 0500 were usually the ones who broke the network in the first place.

  2. Information technology is working properly only when users can stop thinking about how or why it works.

Although there is no sustained learning in the first observation (other than that we always seem to work for the wrong people), the second is an example of what we call the "principle of transparency." Users, unlike us, are not interested in technology for technology's sake. In fact, strange as it may seem, they are not interested in technology at all. They just want the technology to work so they can get their jobs done without having to think about why or how. The ultimate challenge for information technology is to be invisible: completely transparent to the user. Every time users have to think about the technology, it is because something is not working the way it should (or the way they think it should) or because they cannot access resources they want. When a manager has to think about technology, it is usually because she needs to spend more money on it, or because it stopped working after her eight-year-old, surfing the Internet as an admin on her laptop last night, downloaded a virus. Neither experience is all that pleasant.

Fundamentally, the network administrator's job is to make himself or herself invisible, and this is what makes getting managers to spend money on security so hard. Security management is about spending good money to have nothing happen. Success is measured by the absence of events, not by their presence. If nothing happened, you were probably successful in protecting the network, or you were just lucky, and you really do not know which!

NOTE: Security management is about spending good money to have nothing happen.


But, Security Will Break Stuff!

So, how does all this relate to network protection? The problem is that whereas network administration is about ensuring that users can get to everything they need, security is about restricting access to things. A colleague of ours used to quip, "Got an access denied? Good, the security is working." At a basic level, that means security administration is fundamentally opposed to network administration; the two have conflicting goals. Hence, we have an elemental tradeoff that we need to consider.

As mentioned previously, technology must be transparent to users. Transparency can take many forms. The technology should be easy to use. However, technology-acceptance research in management information systems has shown that technology also needs to be useful, that is, to offer some kind of compelling functionality, to be accepted by users. (For simplicity's sake, we sometimes group usability and usefulness into the single term usability.) Essentially, the tradeoff is between security and usability or usefulness. An old cliché says that the most secure system is one that is disconnected, locked in a safe, and dropped to the bottom of the ocean. Of course, if you have an availability goal in addition to confidentiality and integrity, this is a suboptimal approach.

This has implications for all software technologies. Take the operating system (OS), for example. The only perfectly secure OS is one that is still in the shrink wrap. After you break the shrink wrap and install the OS, confidentiality and integrity can be compromised. When you install an application onto any operating system, you enable additional functionality that may make the system less secure, because it increases the attack surface of the system. The more complex the system, the more potential weak points there are. In Chapter 9, "Network Threat Modeling," we discuss the environmental aspects of security hardening and look at how you analyze the usage scenario to optimally harden a system.
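A crude but concrete way to watch the attack surface grow is to snapshot the system's listening ports before and after installing an application. This sketch uses only stock Windows command-line tools (the file names are arbitrary):

    rem Record all connections, listening ports, and owning process IDs before the install.
    netstat -ano > before.txt

    rem ... install the application ...

    rem Record the ports again and compare the two snapshots.
    netstat -ano > after.txt
    fc before.txt after.txt

Every new line in the comparison is a new way for traffic to reach code on the system, and therefore one more point to consider when hardening.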

We can make any technology more secure, but by doing so we will probably make it less usable. So, how do we make it more secure and more usable? This is where the third axis of the tradeoff comes into play. Any good engineer is familiar with the principle of "good, fast, and cheap." You get to pick any two.

Last year, Jesper was visiting a customer to help design a secure network architecture. During the discussion, it became clear that people were struggling with the tradeoff between security and usability/usefulness: making the network more secure in some ways would make it less usable in others. After about 15 minutes of this discussion, he went up to the whiteboard and drew the triangle in Figure 1-2.

Figure 1-2. The fundamental tradeoffs.


Then he turned to the CIO and told him he could pick any two of those. The CIO thought about it for a few seconds and then said, "OK. I'll pick secure and usable." All of a sudden, everyone knew what they had to work with, and the discussion turned toward what resources they needed to make the system both secure and usable.

This fundamental tradeoff between security, usability/usefulness, and cost is extremely important to recognize. Yes, it is possible to have both security and usability/usefulness, but there is a cost, in terms of money, time, and personnel. It is possible to make something both cost-efficient and usable, and making something secure and cost-efficient is not very hard. However, making something both secure and usable takes a lot of effort and thinking. Security is not something you can add on to a fundamentally insecure design; the design itself must incorporate security. It is not some kind of holy water you can sprinkle on an existing implementation to anoint it to a higher state of security. Security takes planning, and it takes resources. In addition, you will never be able to, or even want to, become completely secure. What you want is to be secure enough to be protected against the threats you care about. That would represent an optimal design for your environment.

A note of interest here is that this book is about designing to a security policy, not to a resource constraint. We all live within resource constraints. However, when you design security strategies, you need to stop thinking about resources. If you start your design by limiting yourself to the options that fit within your resource constraint, you will almost certainly end up with a suboptimal design, because you will dismiss options that may be important before they are fully understood. Furthermore, you will almost certainly end up with a design that can never become optimal. A much better option is to design a strategy that gets you where you want to be. Then you figure out what resources you have to work with. After you have those, you can rank-order the components of the design according to benefit and choose which ones to implement now and which to leave for later. Doing the analysis this way also helps you explain to those who control the resources why what they have given you is insufficient.
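To illustrate the ranking step (the components, benefit ratings, and costs below are entirely invented), the result of such an exercise might look like this:

    Component                        Benefit   Relative cost   Decision
    Two-factor logon for admins      High      Low             Implement now
    Segmented server network         High      Medium          Implement now
    Smart cards for all users        Medium    High            Next budget cycle
    Biometric datacenter access      Low       High            Defer

Ranked this way, the design doubles as a negotiating tool: when the money runs out after the second line, everyone can see exactly which protections remain unfunded and which threats therefore remain unaddressed.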

System Administrator ≠ Security Administrator

Making system or network administrators manage security is counterproductive; those job categories would then have conflicting incentives. As a system or network administrator, your job is to make systems work: to make the technology function, transparently, without users having to think about it. As a security administrator, your job is to put up barriers that prevent people from transparently accessing things they should not. Trying to please both masters at the same time is extremely difficult. Dr. Jekyll/Mr. Hyde may have succeeded at it (for a time at least), but for the rest of us, it is a huge challenge. The things that earn you a good performance review in one area are exactly the things that cost you points in the other.

This is an issue today because many of the people who manage infosec are network or system administrators who double as part-time security administrators. Ideally, a security administrator should be someone who understands system and network administration, but whose job it is to think about security first and usability/usefulness second. This person needs to work closely with the network/system administrator, and obviously the two roles must be staffed by people who can work together. However, conflict is inherent at the intersection of security and usability/usefulness. Chances are that only by having two people with different objectives will you find the optimal point on the continuum between security and usability/usefulness for your environment.

How Vendors Can Change the Tradeoff

There are actually several ways to address this tradeoff. Each vendor's technology is used in many different organizations. If we use "effort" as a proxy for the "cheap" axis of the tradeoff, we can see that the effort the vendor expends in making its technology usable as well as secure offsets the effort customers have to expend on the same task. The equation is effectively as follows:

    customer effort = effort required for security and usability − (k × vendor effort)
The relationship is not directly one to one (the factor k above is less than 1) because the efficiency and effectiveness of the resources applied to the problem differ. In other words, not everything the vendor does to make the product more secure and usable will actually benefit the customer. However, some portion of the effort a vendor expends on making the product more secure and usable will benefit customers.
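A quick worked example may make the offset concrete (all numbers here are invented for illustration):

    effort required for security and usability = 100 units
    vendor effort                              =  40 units, at efficiency k = 0.5
    customer effort                            = 100 − (0.5 × 40) = 80 units

Every unit the vendor invests still saves the customer something, just not a full unit, which is why good products reduce, but never eliminate, the customer's share of the work.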

To see an example of this, one need look no further than IPsec in Windows 2000 and later. IPsec is arguably one of the most useful security technologies available in Windows and many non-Windows operating systems. For example, IPsec was one of the fundamental protection mechanisms used in Microsoft's successful entry in eWeek's OpenHack IV competition in 2002.

OpenHack

OpenHack is a recurring competition organized by eWeek magazine. One or more systems are configured and connected to the Internet, and the public is invited to try to break into them. Microsoft has participated in three of these competitions and has emerged from all three unscathed.

For more information on how the Microsoft entry in OpenHack IV was protected, see http://msdn.microsoft.com/library/en-us/dnnetsec/html/openhack.asp.


The IPsec protocol is incredibly versatile. It is also, at least in Windows, the poster child for user unfriendliness. Most people never get past the clunky user interface. If you do manage to get past it, you usually run into one of the truisms about IPsec: it is a lot better at blocking traffic than it is at allowing traffic, and there are few analysis tools to help you figure out why traffic is not making it through. In Windows Server 2003, Network Monitor was enhanced to parse IPsec traffic, greatly decreasing the troubleshooting effort customers need to invest to understand IPsec. The more effort Microsoft expends on making IPsec usable, the less effort customers must expend deploying it, which in turn decreases the cost of making networks both secure and usable. What we have is a teeter-totter effect between vendor cost and customer cost (see Figure 1-3).

Figure 1-3. Balancing between vendor cost and customer cost.
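
To see the usability cost in concrete terms, here is a sketch of what blocking a single protocol with Windows IPsec looked like from the command line, using the netsh ipsec static context that shipped with Windows Server 2003 (the policy, filter-list, and rule names are our own inventions):

    rem Create a policy, a filter list, and a filter matching inbound Telnet.
    netsh ipsec static add policy name="Block Telnet"
    netsh ipsec static add filterlist name="Inbound Telnet"
    netsh ipsec static add filter filterlist="Inbound Telnet" srcaddr=any dstaddr=me protocol=tcp srcport=0 dstport=23

    rem Create a blocking action, tie it all together in a rule, and assign the policy.
    netsh ipsec static add filteraction name="Drop" action=block
    netsh ipsec static add rule name="Block inbound Telnet" policy="Block Telnet" filterlist="Inbound Telnet" filteraction="Drop"
    netsh ipsec static set policy name="Block Telnet" assign=y

Six commands to do what a firewall checkbox does, and blocking is the easy case. Building the matching permit filters for traffic you do want, and then troubleshooting why some of it does not flow, is where most deployments bog down.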


What this really means is that you very often get what you pay for. A product that costs more should also be more secure and usable/useful than a product that costs less. Other factors come into play here, but these tradeoffs hold in general.


