Executive Commitment

Mrs. Smith, the CEO and founder of Internet Software Design (ISD), turned the company from an idea written on a napkin into an overnight success. This Fortune 500, cutting-edge Silicon Valley company was breaking new ground and blowing the doors off the competition. Being in the Internet software design business, the company made computer security a top priority. Mrs. Smith continually stressed her commitment to computer security to her executive management team. She was very well known for her in-your-face style and she always got what she wanted. Well, almost always.

Like many CEOs who issue orders and expect them to be followed, Mrs. Smith assumed that her worldwide company network was secure. That is, until one day a hacker broke into the company's finance network. Undetected by support staff, the hacker transferred all of the company's financial data to another system on the Internet. When the transfer was complete, the hacker emailed the company's financial status (including forecasted earnings) to Fishman & McDonald Investors.

Fortunately for Mrs. Smith and company, a manager at that securities firm immediately reported the e-mail contents to Mrs. Smith's Chief Financial Officer (CFO), Charles Winifred. That report was the first indication Charles had of a network security breach, and it left him with many unanswered questions. Charles wanted to know how the system had been broken into. He wanted to know why his support staff didn't detect the unauthorized access to the data. And, of course, he wanted to know who was responsible for the theft and disclosure of information. Basically, he wanted answers and he wanted them now.

Charles had assumed his finance network was secure. After all, isn't that what they paid the system administrators to do? How could they be so negligent? And, why didn't they notice the security breach before the data was disclosed over the Internet?

However, Charles missed an important concept in accountability. In the end, it is management that is responsible for the reliability and integrity of the data on corporate networks, not the system administrators. It is the executive managers, in particular, that the auditors and stockholders will hold accountable. If the company's forecasted earnings are posted all over the Internet, the auditors, stockholders, and news reporters are going to go after the executives at the top, not the system administrators.

To better illustrate the roles of management in computer security, let's take a closer look at the events before and after the ISD financial information was disclosed.

Day 1: Unsecured Systems

At Charles's urging, ISD's internal security expert, Martin Patterson, was called in immediately to conduct a security audit. Martin was one of the five members of ISD's security team and arguably the best security guru in the company. He took any level of security breach seriously, always giving incident response top priority in his work queue. Basically, Martin immediately stopped whatever he happened to be doing and pounced on each security incident with the ferocity of a pit bull.

Martin began his audit by probing the finance systems for information and testing the network for security vulnerabilities. It took Martin less than an hour to get the facts, which were quite shocking. For a company that was so vocal about its commitment to security, the actuality was pretty appalling.

Martin found that the corporate systems were clearly installed right out-of-the-box, without configuring security. Mission-critical systems were mislabeled and poorly protected, putting the entire network into a high-risk zone. Overall, the network had so many security holes that it could have been the target at the end of a busy day at the rifle range. And these systems were maintaining the company's most confidential financial data!

So far as Martin could tell, the systems were wide open, with no auditing or monitoring mechanisms installed. There was plenty of easy access, and the chances of getting caught were slim to none. Anyone with a little security knowledge could have a field day on the network.

Charles also asked Martin to find out where the e-mail message containing the forecasted earnings originated. So, after testing the systems, Martin tried to trace the e-mail message. He figured his attempt would be fruitless. And it was. Martin got nowhere in his attempt to follow the hacker home.

Although CFOs might be hesitant to believe that an e-mail message can be untraceable, I was hardly surprised by Martin's results. It's pretty easy to spoof Sendmail and make an e-mail message seem to originate from someone else. My 13-year-old sister Laura could handle the job with no problems.

In any case, spoofed mail is nearly always a dead end in the hunt for a hacker. When you hit that, you simply rate the hacker's creativity in inventing domain names and move on. So, that's what Martin did.
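Why spoofed mail is such a dead end is easy to demonstrate: SMTP (the protocol Sendmail speaks) does nothing to verify the From header; it is just a text field the sender fills in. A minimal sketch (all addresses here are invented for illustration):

```python
from email.message import EmailMessage

# The From header is supplied entirely by the sender; nothing in SMTP
# checks it. At the header level, a forged message looks exactly like
# a legitimate one. (Addresses are invented for illustration.)
msg = EmailMessage()
msg["From"] = "mrs.smith@isd.example.com"   # forged: we are not Mrs. Smith
msg["To"] = "analyst@investors.example.com"
msg["Subject"] = "Forecasted earnings"
msg.set_content("Confidential figures attached...")

print(msg["From"])  # prints the forged identity: mrs.smith@isd.example.com
```

Actually delivering such a message only requires a mail server willing to relay it. The Received headers added in transit are the sole trail left behind, and those are easily muddied, which is why tracing spoofed mail so often goes nowhere.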

Martin completed his audit and summarized his findings in a classified management report. Then he prepared for the really hard part: delivering the report to management. Thankfully, we're past the point in history when messengers are literally shot for delivering bad news. But there are still a lot of figurative arrows flying. Delivering a high-risk security report can earn you a cold glare or demotion just as easily as a pat on the back. Luckily for Martin, Charles was a pat-on-the-back kind of guy.

As much as Charles appreciated Martin's thorough job, he was absolutely shocked by his findings. Charles had genuinely believed that all systems on the network were secure. That's what all the executive management staff had assumed. However, the audit showed just how easily information could be changed, stolen, or destroyed, without a shred of evidence to trace the intruder. Charles thanked Martin for delivering the facts (Martin got to play hero this time!) and immediately ordered the next level of management to fix the problems.

A Year Later: Unauthorized Access Continues

Over the next year, there were several successful break-ins on ISD's intranet (successful for the hacker, that is). The only good aspect was that Charles received his news about the break-ins from ISD's internal audit manager and not from CNN.

Keeping break-ins out of the headlines is a major goal of most CFOs, and a lot harder than it seems. Many hackers today actually make it a point to report their break-ins to news agencies themselves. Hackers know that the collateral damage from bad publicity is often worse than the damage caused by the attack itself. In other cases, the embarrassment of making the attack public is the actual point of the attack. Charles felt lucky that at least his embarrassing security problem was kept relatively private.

Lucky or not, Charles was still in a bad situation. He was infuriated, and still surprised, that his network was still vulnerable. Hadn't he ordered his staff to fix the problem last year? Didn't anyone do what he told them to? At this point, Charles was looking for heads. And I don't mean that he wanted to increase head count. Charles wanted those heads on the chopping block.

At this time, Charles met with the company's CIO and internal audit director to discuss the security risks. They decided that it was time to hire an independent security auditor. That's where I stepped in.

As I entered the picture, I already had plenty of information from the previous audit. What a bonus! Usually, an auditor spends a lot of time interviewing staff, looking at network diagrams, and probing for information to discover which systems could be vulnerable.

I knew which systems had been vulnerable the previous year, so that seemed like a good place to start testing for several reasons. First and foremost, I used this approach because I could build statistical information from hard facts. Executives love statistics. Anything that I can put into a graph or pie chart turns me on, because I know that transferring information to executive managers in this format adds value.

Most executives I work with are very smart. But they also have so much information flowing past them that they expect, and need, accurate, understandable information that gets to the point in one page or less. To do that, the executive summary report needs to make sense at a glance. Having said that, I've seen security audit reports that confused the hell out of me. Flashing a poorly written and badly structured report past a top-level executive not only is pointless but also negates the usefulness of the work done to produce the audit. Because addressing risk and approving funding so often go hand-in-hand, it is crucial that top-level executives understand the risks and potential consequences. For that reason, executive management reports need to be short (ideally one page and never more than two pages), easy to read, and easy to understand.

The results of this audit would be easy for me to transfer to management. I visualized the graph even before I started the audit. I would match the vulnerabilities from last year against the percentages found this year. This would be great! I stored that thought in my memory and started my audit.
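The chart I had in mind is simple to compute: for each class of vulnerability reported last year, what fraction of the affected systems is still vulnerable this year? A sketch with invented figures (the vulnerability names and counts are illustrative, not ISD's actual data):

```python
# Invented illustrative figures: systems affected per vulnerability
# class, last year's audit vs. this year's.
last_year = {"unpatched OS": 12, "no auditing": 12, "weak passwords": 10}
this_year = {"unpatched OS": 12, "no auditing": 11, "weak passwords": 10}

# Percentage of last year's findings still open -- the executives' chart.
still_open = {
    vuln: round(100 * this_year.get(vuln, 0) / count)
    for vuln, count in last_year.items()
}
print(still_open)
# {'unpatched OS': 100, 'no auditing': 92, 'weak passwords': 100}
```

Numbers like these make the management message unmistakable at a glance: a year later, almost nothing has been fixed.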

I began by reading the audit report with Martin's findings from a year ago. It was difficult for me to read the report. All of the risks were reported, but in a technical fashion with no logical flow. If management received a report like this, they would have no idea where to start. It took me more time than I had planned to dig the real information out of the report.

After puzzling through Martin's report, though, I did have a good idea where the high-risk systems were on the finance network. I probed those systems first for information. Next, I pulled down a copy of the password map and started running Crack on the passwords. I like to start cracking passwords at the beginning of my audit just to see how many I can crack right away. This password map was rather large: 520 users. Surely I'd be able to hit at least a few passwords. And I did. Checking the crack.out file showed 10 passwords guessed right off the bat. I'd figured as much. Leaving the checking of additional Crack results for later, I focused my audit on the high-risk systems.
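Crack works by hashing candidate words with each account's salt and comparing the result against the stored hash; no passwords are ever decrypted, only guessed. The core loop, in spirit (this sketch substitutes SHA-256 for the old Unix crypt() routine, and the accounts, salts, and wordlist are invented):

```python
import hashlib

def hash_pw(salt: str, password: str) -> str:
    """Salted hash, standing in for the crypt() call Crack really uses."""
    return hashlib.sha256((salt + password).encode()).hexdigest()

# Invented password map: username -> (salt, stored hash). Two users
# chose dictionary words; one chose something stronger.
shadow = {
    "alice": ("a1", hash_pw("a1", "sunshine")),
    "bob":   ("b2", hash_pw("b2", "dragon")),
    "carol": ("c3", hash_pw("c3", "kV9#xT!4q")),
}
wordlist = ["password", "sunshine", "dragon", "letmein"]

# Try every word against every account; a hash match means a cracked
# password.
cracked = {
    user: word
    for user, (salt, stored) in shadow.items()
    for word in wordlist
    if hash_pw(salt, word) == stored
}
print(cracked)  # {'alice': 'sunshine', 'bob': 'dragon'}
```

Any system administrator can run the same check; the only difference between the auditor's run and the hacker's is who sees the output first.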

The system administrator gave me access to all the systems. When performing an audit, I prefer to log into a system to test security rather than break in from the network. When I first started auditing, I loved trying to break in from the network first (a penetration test), because it was exciting and it helped me to build my break-in skills. As I became more proficient at auditing, I found that I could cover more territory faster and more effectively by requiring the owner of the system to give me a login account. Then, I would log into the system to check for security vulnerabilities. To that end, sometimes I don't run a penetration test at all. First, I probe the systems for information from the Net (just to see how much information I can get). Then, I test for bad passwords. After that, I log in and test for vulnerabilities and configuration errors. The final audit test I run is a penetration test from outside (only when necessary).

I don't believe that a penetration test is always needed. For example, consider a system that turns out to have an old version of Sendmail. It's a well-known fact that such a system can be broken into. Why waste time essentially proving that water is wet?

In some cases, I do run a penetration test on systems known to be vulnerable just to show proof of concept to management. In other cases, proof of concept isn't needed. It all depends on the scope of the audit, client priorities, and management expectations.

In this audit, a penetration test was clearly not necessary. Management knew that the network could be broken into. (I was there because hackers knew that, too!) The real issue in this audit was why the network was still vulnerable. Knowing that, I decided to punt on the penetration test and move on.

I proceeded to check the most critical finance system. It was wide open and had no security patches. I broke root by exploiting a very old security bug. It was easy to see that these systems were out-of-the-box installations. Clearly, no additional security had been configured. I tested a second system, then a third, and a fourth. Same story. So far as I could tell, absolutely nothing had changed since the last security audit was performed. Apparently, the people at line level (in the trenches) had not fixed the problems.
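Spotting an out-of-the-box installation is largely a matter of diffing the live configuration against a hardening baseline. A toy sketch (the setting names and values are invented; a real baseline would cover far more):

```python
# Invented hardening baseline vs. an out-of-the-box install.
baseline = {"telnet_enabled": False, "remote_root_login": False,
            "audit_logging": True,  "security_patches_current": True}
system   = {"telnet_enabled": True,  "remote_root_login": True,
            "audit_logging": False, "security_patches_current": False}

# Every setting that deviates from the baseline is a finding.
findings = [key for key, required in baseline.items()
            if system.get(key) != required]
print(f"{len(findings)} deviations: {findings}")
```

On the ISD finance systems, every check of this kind came back the same way it had a year earlier.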

The $64,000 question, of course, was why not? Clearly the security problems at ISD should have been fixed. Line management either didn't hear the message from Charles at the top, or they chose not to listen.

The answer seemed to be that when Charles told his people, "Fix the security problems now," he considered the matter closed. He never checked that the order was carried out. For whatever reason, the problems were not fixed and Charles did not get the results that he wanted.

Speaking of results, I realized at this point that I was still running Crack. Wondering how many more passwords Crack was able to uncover, I checked the crack.out file again. Incredible! One hundred more passwords had been cracked. Even more amazing was that Crack wasn't done! It was still pounding away trying to guess passwords. It was obvious that the users had never been taught how to select good passwords. It was equally obvious that the system administrator had never bothered to check for bad passwords.

It makes me crazy when system administrators don't train their users. Far too often, systems are installed and users are assigned accounts without ever being taught the importance of password selection and maintenance. It is also fairly common for system administrators to forgo testing the passwords. Sometimes, they really don't have the time. Many times, however, they simply don't know how and are afraid or embarrassed to ask.

Additionally, bad passwords had been reported as a problem in the previous year's audit report. And, unlike some of the other problems reported, the bad passwords could have been fixed with very minimal effort. I guess no one felt that it was his or her job to make that effort.

It's kind of a shame that last year's audit report didn't specify how many bad passwords were found. I found it really hard to believe that it could have been any worse than this year. By the time Crack was done, it had broken a full 190 passwords on a system with only 520 users. More than one in three users had chosen a bad password. At that rate, it seems rather pointless to use passwords at all. Why not just broadcast the passwords on National Public Radio as reminders to any employees who had forgotten their middle names or birth dates?!

As it turned out, bad passwords were just the tip of the security iceberg facing the ISD network. The core problems all seemed to be focused in one area, however: security risks induced by humans. To get to the bottom of those problems, I began to interview employees.

To look for the communication breakdown, I started at the top level of management and worked my way down. Along the way, I made some illuminating discoveries:

  • The executive management team never requested, or received, a report on the status of any changes made to improve network security.

  • Executive managers simply assumed that security problems would be fixed because they asked that they be fixed.

  • The system administration department was understaffed and did not have the time to fix the systems.

  • The system administrators were working overtime simply to install new users and keep the company's systems online. As much as they wanted to address the problems, they just never found the time to get around to them.

  • The system administrators also did not know how to fix the security problems. They asked management for help, but that kind of training wasn't figured in the budget. So, the request was postponed for later consideration.

  • The line managers also requested additional staffing resources to secure the network. Of course, that wasn't in the budget at that time, either. Again, final action was postponed for further consideration.

A year later, funding for new staff still had not been approved. In the meantime, the line managers put a hold on trying to fix the security problems until the new head count was approved. Bottom line: no one did anything but wait.

It's amazing how much you can find out when you interview the staff. The sad part in this story is that the line managers knew that their systems were still insecure. However, top management seemed clueless. That's because top management didn't ask anyone to report back on problem resolution. Line management knew that resolution was not forthcoming, but they didn't take the initiative to report back. As a result, top management remained clueless. They honestly felt they had dealt with the issue and moved on.

What happened in this case is not really that uncommon. Like many companies, ISD was downsizing. So, requests for increased head count were routinely denied. The line managers may also not have clearly articulated why that increased head count was absolutely necessary. Or, it may have been just one request in a crowd. As I'm sure you know, when infighting is common for limited positions, every requested head-count position suddenly becomes absolutely indispensable.

It's also possible that the requesting line manager was clear in the reasons for his or her request, but the reasoning became muddled by the time it traversed the four levels of management in between the requesting manager and the executive authorized to approve the funding. No doubt the funding request would have been approved if the CIO had directly received a request that said, "This funding is required to fix security vulnerabilities because your entire network is at risk. Until this position is filled, data can be easily stolen, modified, and destroyed."

Summary: Take an Active Approach

How could a Fortune 500 company's finance network be so vulnerable to attack? Poor management, not enough training, faulty communication, and a complicated reporting structure (too many layers of management).

Although Mrs. Smith's executives clearly voiced the importance of security, they never took action to make sure that security existed. Telling your staff to "fix problems" is not enough. Managers must take an active approach to security. At the very least, managers should request clear proof, in writing, that identified security problems have been fixed. In this case, such a report would have let management know that security issues were not being addressed, because funding had not been approved for additional staff.
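The written proof recommended above doesn't need to be elaborate. A finding-by-finding status register that rolls up into a single summary line is enough to tell an executive whether the order was actually carried out. A sketch with invented findings:

```python
# Invented findings register: (finding, status), as reported back
# to management in writing.
findings = [
    ("unpatched finance servers", "open"),
    ("no audit logging",          "open"),
    ("weak passwords",            "fixed"),
    ("spoofable mail relay",      "open"),
]

# The one line an executive summary actually needs.
open_count = sum(1 for _, status in findings if status == "open")
summary = (f"{open_count} of {len(findings)} identified security "
           f"problems remain unresolved")
print(summary)  # 3 of 4 identified security problems remain unresolved
```

Had Charles received even this much a quarter after his order, the funding deadlock would have surfaced a year sooner.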

In many cases, security comes down to funding. The importance of the data you are trying to protect typically determines how much you need to spend to protect it. Often, systems remain at risk for one quarter after another simply because no one thinks to budget for security until after a break-in occurs.

In this scenario, Mrs. Smith was extremely lucky. The company's data could have been destroyed and its systems shut down for several days. Mrs. Smith was also fortunate that the break-in was kept quiet. This is not the kind of press coverage any CEO wants to see on CNN. The fallout from the bad publicity could well overshadow the damage caused by the actual break-in.

I am often asked to speak to executives about Internet and intranet security. When I do, I often discuss the case of ISD. I find myself repeating, "Yes, it did happen. And, it probably will again." To drive the point home, I also point out, "ISD was a billion-dollar-a-year company. If it could happen to them, why do you assume that your network is immune? Do you know what the security is like on your network? When was the last time you received an executive-level security summary?" At this point, much of my audience is usually sweating.

For your own peace of mind, try to avoid sweating problems after the fact. Instead, take an active approach to security.

IT Security: Risking the Corporation
ISBN: 013101112X
Year: 2003