Come Together (Right Now)


Let's pretend that the advice given in this chapter is sound. Even if you accept the recommendations wholesale as worthy, the act of aligning information security and software development is a serious undertaking (and not one for the faint of heart). Close cooperation with the development organization is essential to success. If infosec is perceived by dev as the security police or "those people with sticks who show up every once in a while and beat us soundly for reasons we don't understand," you have a problem that must be addressed (see the box The Infosec Boogey Man).

In many cases, dev is more than willing to accept guidance and advice from information security people who know what they're talking about. One problem is that dev doesn't know who in information security to talk to, who might help them, and who might just be a blowhard security weenie. To fix this problem, the first step for any information security professional who wants to help out with development efforts should be to reach out to the developers, roll up their sleeves, and offer to assist.

Once you have made dev aware of your willingness to help, consider taking small steps toward the goals laid out in this chapter. Rather than trying to become involved in every phase of a giant world-changing endeavor all at once, try one phase at a time. Be careful to not overwhelm the overall system by attempting to make too many changes at the same time. (Much more about this and about adopting software security in large organizations can be found in Chapter 10.)

The Infosec Boogey Man

In too many organizations, infosec shows up at the end of a long and strenuous product development march, calls the baby ugly, and stops everything in its tracks. Though shipping ugly babies is not really a good idea, handling things this way engenders hard feelings among developers every time. Imagine busting your hump to get a product completed almost on time and just about kind of on budget (for months or sometimes years), and then having some outsiders come along and impose some kind of mysterious new requirements on your system that you never heard tell of before. To make matters worse, these new requirements are a serious imposition that will take time to address; heck, half of them require architectural-level changes. Does that make you feel all warm and fuzzy? Of course not!

In my work as a software security consultant I have seen the "ugly baby" problem rear its (um) ugly head far too often. Gaining the trust and understanding of the development organization is something that needs to happen early in the lifecycle. Waiting until the end to carry out a penetration test or even a hard-core risk analysis (which is likely to result in the exposure of gigantic security issues that need to be fixed) is just like showing up out of the blue and beating a victim with a stick. Software security is better introduced slowly, methodically, and gradually than with explosions, much trumpet-blaring fanfare, and thumping of chests.


Another positive step is for the information security troops to take the time to learn as much as they can about software development in general and their organization's software development environment in particular. Study and learn about the types of applications that your software people develop; why they are working on them (i.e., what business purpose the software is being built for); what languages, platforms, frameworks, and libraries are being used; and so on. Showing up with a clue is much better than showing up willing but clueless. Software people are not the most patient people on the planet, and often you have one and only one shot at getting involved. If you help, that's great. But if you hinder, that'll be the last time they talk to you.

In the end, success or failure is as likely to be driven by the personalities of the people involved as anything else. Success certainly is not guaranteed, even with the best of intentions and the most careful planning. Beer helps.

Coder's Corner

Ken van Wyk tells an interesting story about an enterprise security assessment he performed for a major financial services company. During the assessment, he uncovered a software security problem that could easily have been avoided had there been better coordination between the software developers and the people who deployed and ran the software.

The software that Ken was asked to review was an application that controlled a phone switch system running on a SCO UNIX system connected to the company's internal data network. He began by looking at the virtual environment that the application was running in. (By the way, this approach remains the quickest and easiest way of compromising an application.) In short order, Ken discovered that there were large numbers of OS-level weaknesses that enabled him to get shell access on the UNIX phone switch controller. Once he was "inside," things got worse.

Turns out that the software developers who wrote the controlling application had ported the application from MS-DOS to UNIX. By itself, that's fine, except for the fact that they had evidently taken the path of least resistance: get the application to run and then you're done. MS-DOS, being a single-user, single-tasking operating system, didn't provide much of anything in the way of file access controls, whereas UNIX, being a multiuser, multitasking operating system, did. The software developers apparently failed to spend the time to learn much of anything about the OS that they were porting their application to. This was evident because all of the application's files and directories were left unprotected at the operating system level (all files were mode 666 or 777).
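To make the exposure concrete, here is a minimal sketch of the kind of quick audit that would have flagged the problem: walk the application's install tree and report anything writable by "other" (the trailing digit of 666 or 777). The sketch is in Python purely for illustration, and the /opt/switch-controller path is a hypothetical stand-in, not a detail from Ken's assessment.

```python
import os
import stat

def find_world_writable(root):
    """Walk a directory tree and report entries whose permission bits
    grant write access to 'other' (the last digit of 666/777)."""
    findings = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            path = os.path.join(dirpath, name)
            mode = os.lstat(path).st_mode          # lstat: don't follow symlinks
            if mode & stat.S_IWOTH:                # world-writable bit is set
                findings.append((path, oct(stat.S_IMODE(mode))))
    return findings

# Audit a hypothetical install directory for the controller application.
for path, mode in find_world_writable("/opt/switch-controller"):
    print(f"world-writable: {path} ({mode})")
```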

The problem with this approach should be pretty obvious. Once logged into the phone switch controller, any user (or attacker) had complete read/write access to any component of the phone switch system, from its executable files to its configuration data. Ken "owned the farm," as we sometimes say in the security assessment world.

All of this could have been easily avoided. The developers made several flawed assumptions about the operational environment of the phone switch controller. These flawed assumptions would have stood out in stark relief if the developers had spent just a few minutes talking with some IT security people when they were porting the application to UNIX. Further, putting in place even some basic file and directory access controls on the switch controller would have required only a modicum of UNIX filesystem knowledge.
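As a sketch of what those basic controls might look like (again in Python for illustration; the install path, service account IDs, and 750/640 modes are assumptions, not details from the assessment), the deployment could have given a dedicated account ownership of the tree and stripped all access for "other":

```python
import os
import stat

APP_ROOT = "/opt/switch-controller"   # hypothetical install location
APP_UID, APP_GID = 500, 500           # hypothetical service account

def lock_down(root, uid, gid):
    """Chown the application tree to the service account and remove all
    access for 'other': directories become 750, regular files 640.
    (Executables would additionally need the owner/group execute bits;
    running this requires root privileges because of os.chown.)"""
    for dirpath, dirnames, filenames in os.walk(root):
        for d in dirnames:
            path = os.path.join(dirpath, d)
            os.chown(path, uid, gid)
            os.chmod(path, stat.S_IRWXU | stat.S_IRGRP | stat.S_IXGRP)   # 750
        for f in filenames:
            path = os.path.join(dirpath, f)
            os.chown(path, uid, gid)
            os.chmod(path, stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP)   # 640

lock_down(APP_ROOT, APP_UID, APP_GID)
```

A few minutes of this kind of housekeeping, done in cooperation with the people who run the system, is exactly the modicum of UNIX filesystem knowledge the anecdote is talking about.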

Effective access controls would have made a big difference, adding a very useful additional layer of protection for the application and its data. Of course, other security issues also required attention, but addressing the application's environment was the lowest of low-hanging fruit.




