Unsafe Network

Gerald Pushman was hired to lead a top-secret project by the Chambersburg Museum of Art. He was a new manager at the museum, but not new to secret projects. He was quite used to them and he knew what it took to keep top-secret projects secret.

Gerald was concerned about physical security and the security configuration of his computer systems. Since he had a hands-on management style, rather than simply handing off responsibility for systems security, Gerald met early on with Kirsten Smith, the network administrator.

Kirsten had worked at the museum for several years and knew every inch of the network. Gerald told her that because of the sensitivity of the data, he was concerned about the network in which the new systems would be installed.

Kirsten told Gerald up-front, "If you are connecting systems to the network, I would worry about the installation and configuration of the systems, especially if it's a top-secret project." Talking to Kirsten, Gerald discovered that the museum's database servers were wide open for anyone to access. Those servers contained incredibly sensitive information. Not only were the art values accessible, but so were the dates and times of art arrivals and departures, as well as the transportation plans. A high-tech art thief, posing as an employee with access to even one system on the network, could easily use that information to hijack art on its way to or from the museum.

Gerald was lucky that Kirsten was honest enough to provide so much information. He probed her knowledge further and found that the security team and system administrators had been battling over the security configurations for years. Since they'd never agreed on an approach, most of the systems in the network had never had any security configured at all.

Gerald knew that until the security team and system administrators resolved their differences, the security of his systems would be compromised. Given their history, that resolution was unlikely to come overnight. As a temporary work-around, Gerald decided to install his systems in a secured room on their own network away from the museum's network. Imagine having to keep a project off your own network to prevent it from being stolen or sabotaged! Let's have a look at how the situation got that bad.

In the Beginning: Bypassing the Corporate Network

For Gerald to keep his systems off the museum's network, he had to start his own network and hire his own system administrator. That added some pretty hefty dollars to his budget. So, Gerald had to meet with the museum's executive staff for approval.

Needless to say, they were not overjoyed at this approach. In truth, they simply couldn't believe that their own network was insecure. Eventually they did give Gerald what he wanted. But they made him jump through hoops to get it. And, as part of the package, they demanded that he provide them with proof that the museum's network wasn't secure. That's when I entered the picture.

Day 1: Collecting Evidence

Gerald gave me all the background information. That is, he filled me in on the ongoing feud between the system administrators and security group. I talked to Kirsten at the same time, and she informed me that all the data on the network was at risk.

With that information in hand, I knew that I had a tough job ahead. First, I needed to find out whether the systems really were insecure. If so, I then had to find out why.

Obviously, management (i.e., Gerald) felt that the network systems were insecure. He was going by word of mouth, however. He didn't have any tangible evidence. My audit needed to provide that tangible evidence.

Since the whole point of this audit was to provide proof of concept, I decided that I needed to conduct system-level audits, interview the staff, and perform penetration tests.

In this case, Gerald lucked out because Kirsten provided him with a lot of information. That's not always the case. Sometimes, an entire network will be at risk and the support staff won't say a word. Given that, I had no idea whether the rest of the support staff would be as forthcoming with details as Kirsten.

In some audits, it's best to collect as much evidence as possible before you interview the staff. That way, you can use the evidence as leverage if the staff is reluctant to share information or acknowledge that a problem exists. Since this seemed to be one of those audits, I decided to collect the data before starting the interviews.

Kirsten set me up with an account on the network and gave me a network diagram that was supposed to identify the high-risk systems. The network map looked reasonable. Since I like to begin security testing on the highest-risk systems, that's what I looked for on the map. However, I couldn't really tell from the map which systems were in this category.

I talked to Kirsten again. She pointed out a handful of systems that she considered high risk. We talked it over for a while until I was sure I wasn't missing any. With my list of mission-critical targets, I was ready to begin my audit.

I'm always amazed when I can access my first target system without even giving it a password. It feels like walking up to an ATM that hands you money before you even take your card out of your wallet. This was one of those times.

The first system I picked was obviously configured to trust the system on which I'd been given an account. Once I was into the main database server, I was able to get into all of the other mission-critical systems. I didn't even have to break a sweat. The web of trust between these systems was astounding. They all trusted the first system I broke into, so I was into one system right after another.

Obviously, someone wasn't paying attention when the systems were set up. Unless, of course, they simply trusted every single person who had access to the museum's network. In today's world, that level of naivety can land you in a lot of trouble! According to Michael Anderson, computer forensics specialist and former U.S. Treasury agent, 85 percent of industrial espionage originates within the target company.
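
The chapter doesn't say which trust mechanism was actually in use, but on Unix-style servers the classic culprit is r-command host trust: a matching entry in /etc/hosts.equiv or in a user's .rhosts file lets a remote login in with no password at all. A minimal Python sketch of a check for that kind of misconfiguration, assuming a Unix or Linux host and those conventional paths, might look like this:

    #!/usr/bin/env python3
    """Flag rsh/rlogin-style trusted-host entries.

    A hedged illustration, not the tooling from the audit itself: it assumes
    classic hosts.equiv/.rhosts trust, where a matching entry grants
    password-less remote access.
    """
    import glob
    import os

    # Files that grant password-less access under the old r-command model.
    TRUST_FILES = (["/etc/hosts.equiv", "/root/.rhosts"]
                   + glob.glob("/home/*/.rhosts"))

    def report_trust(path):
        """Print every host (or wildcard) the file trusts."""
        try:
            with open(path) as fh:
                for line in fh:
                    entry = line.strip()
                    if not entry or entry.startswith("#"):
                        continue
                    # A bare "+" trusts every host and every user -- the worst case.
                    severity = "CRITICAL" if entry.startswith("+") else "warning"
                    print(f"{severity}: {path} trusts '{entry}'")
        except OSError:
            pass  # file absent or unreadable; nothing to report

    if __name__ == "__main__":
        for trust_file in TRUST_FILES:
            if os.path.exists(trust_file):
                report_trust(trust_file)

A handful of lines like these, run regularly on each database server, would have flagged that kind of promiscuous trust long before an auditor showed up.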

On the plus side, the museum support staff did run regular backups and rotate the tapes offsite for safekeeping. Still, making data freely available to anyone on the network is not a good idea. This was exactly the kind of risk that I needed to show to the museum's executive staff.

It took me most of the day to get into the important systems and collect the evidence I needed to back up my final report. The breaking-in part of this audit was very simple. It just took some time to compile all the data points.

When all was said and done, my list of risks looked like this:

  • The security configuration on mission-critical systems was inadequate.

  • The systems themselves were not classified (noncritical, mission-critical, etc.).

  • Access control was not configured.

  • Root access was easily obtained.

  • Passwords were easily guessed.

  • Security patches weren't installed.

  • Intrusion-detection mechanisms weren't enabled to prevent, detect, or report unauthorized access to proprietary information.

  • Audit trails simply did not exist.

  • Excessive file permissions existed.

  • Unnecessary network services were running.

In short, Kirsten was right. The systems were wide open. Not one of the high-risk database servers had any serious security. Why? That was the answer that I needed in order to complete my audit.
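
The chapter doesn't show the tooling behind these findings, but two of the items above, excessive file permissions and unnecessary network services, lend themselves to a quick automated check. The following Python sketch is only an illustration of the kind of check an auditor might run, not the method actually used at the museum; it assumes a Linux host with the ss utility installed:

    #!/usr/bin/env python3
    """Check for world-writable files and listening TCP services.

    A hedged illustration, not the audit tooling from the chapter.
    """
    import os
    import stat
    import subprocess

    def world_writable(root="/etc"):
        """Return regular files under `root` that any local user may modify."""
        findings = []
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    mode = os.lstat(path).st_mode
                except OSError:
                    continue
                if stat.S_ISREG(mode) and mode & stat.S_IWOTH:
                    findings.append(path)
        return findings

    def listening_services():
        """List listening TCP sockets; each is a service to justify or disable."""
        return subprocess.run(["ss", "-tln"], capture_output=True, text=True).stdout

    if __name__ == "__main__":
        for path in world_writable():
            print(f"excessive permissions (world-writable): {path}")
        print(listening_services())

Findings like a world-writable configuration file or an unexplained listening port are exactly the kind of tangible evidence an executive staff finds hard to argue with.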

Kirsten, of course, gave me her opinions on that matter. But I needed facts. Since it was Friday afternoon, I'd have to wait till next week. It was strange to start an audit at the end of the week, but that's just how it fell into my schedule.

I was in the process of packing up my things when I remembered that I had to fly home to meet my sister on Saturday. I was taking her to San Francisco for one of my spoil-the-kid weekends. She's much younger than I am, just 13, so I make it a point to schedule special time with her when my work life permits: the kind of weekend where she gets spoiled and I spend money. It's kind of like being a grandparent without having to grow old first. This weekend we were meeting in San Francisco to visit as many museums as possible. Among other things, the girl is an artist (and a good one at that), and she loves going to museums and art galleries. It would be nice to spend the weekend with her and see the museums from a different point of view: not in terms of risk and data flying all over the place because of unsecured systems, but simply as art for the sake of art.

Kirsten walked me out to the lobby and said she'd meet me back there on Monday at 9:00 a.m.

Day 2: System Administrators Versus the Security Team

Weekends always go too fast. Before I knew it, I was back in the lobby waiting for Kirsten and ready to finish my audit. I was a bit anxious.

I knew that in the second part of my audit I'd have to prove why the systems weren't secure, which meant interviews. I don't mind interviews and meeting people, but from what Kirsten said, I was walking into a war about configurations and policies and procedures that had been going on for years.

The good news was that I had a lot of energy after the weekend. Too often, I find interviewing depressing. It brings me down to talk to people who often don't care about the data they're being paid to protect.

I felt certain that the war between the security team and system administrators was behind the risks I had found. Soon, I'd know that beyond a doubt.

Kirsten had scheduled me for interviews with all of the relevant players. She was also kind enough to give me a few free hours in the morning before the interviews started. I appreciated that. (Who knows what those guys looked like before their morning coffee?)

Before entering the war zone, I decided to review the policies and procedures published by the security team. Usually, I can get a pretty good feel for a company's attitude toward security by reading the policies and procedures. A company that doesn't have good policies and procedures usually doesn't have good security.

I found several problems with the policies and procedures. First, they were difficult to read and understand. My gut feeling was that the system administrators probably didn't configure security because they couldn't understand the policies and procedures. The documents were also seriously out of date. The last update appeared to be nearly three years old. As a result, some policies weren't even technically correct. In fact, if you followed one of the documented procedures, you would actually introduce a security hole and make the system more vulnerable to attack.

Reading through the materials, I got the feeling that the documents had initially been drafted by someone who understood the importance of policies and procedures. However, I also got the feeling that person had since left the security team or maybe even the museum.

Now I was ready to meet the staff. Unfortunately, my first meetings were group meetings. Group meetings are often tense even when there isn't a war in progress. I proceeded to the first meeting, keeping in mind that I could always talk to the key players one-on-one later if I needed to.

Who Owns Security?

I met with the system administrators first. Since they were responsible for configuring the security on the systems, I wanted to hear their side first. And, of course, the system administrators are always seen as the key culprits when a security problem surfaces.

I started with a pretty basic question: "What procedures are used to configure security?" Incredibly, the response was, "None." Their "procedure" was to simply bring the systems online without any special security precautions.

I pressed on, "Aren't you responsible for configuring the security on the systems?" They said, "Yes, but the security team is supposed to tell us how to do that. Since their policies and procedures make no sense, we have no idea how to configure the systems."

I followed up by asking who had been working there the longest. One of the system administrators raised his hand and said, "Five years for me."

"And has this been a problem for that long?" I questioned.

"Yeah, I guess so," was his answer.

Incredible! Obviously, these guys knew that their networks had been risky for years and didn't seem in any hurry to resolve the problems. I tried to stress that the system data needed to be protected now and not another five years down the road, but given their history, I didn't expect any quick resolutions.

Transferring Responsibility

Next I met with the security team. Like the system administrators, the security staff also knew that the network was not secure. Of course, they placed the blame on incompetent system administrators. When I asked them when they had last shown the system administrators how to configure security, they informed me that the policies and procedures could be downloaded from the Net. "Anyone who knows what he's doing should be able to figure it out!" I was told.

In talking to the security team, I also confirmed my earlier suspicions. Yes, the fellow who wrote the security policies and procedures had left the company two years before. They had recently assigned someone to look into updating them. Unfortunately, simply updating the procedures at this point was too little, too late. The systems were already wide open and the policies were virtually incomprehensible.

I continued to question the security team, system administrators, and management, but most of the answers reiterated the following problems:

  • The security team had been responsible for initially writing the policies and procedures. No one was responsible for updating them, however.

  • In theory, the policies and procedures were posted to a server for easy access by the system administrators. In reality, no one told the system administrators how to get to that server.

  • Any system administrators who had been lucky enough to locate the policies and procedures had been unable to understand them. Basically, the policies and procedures were too confusing and poorly written to be of any use.

  • Management obviously did not view policies and procedures as important.

Summary: Security Is the Casualty of War

Security policies are the first line of defense. Without them, your company will be at war. Not only will there be battles between the different support organizations, but you could be battling hackers interested in a different kind of war. There will be no politics on their part, just a raw desire to change, steal, or destroy data. When that type of war starts, it no longer matters who wins the little cross-department battles. Given the amount of energy focused on the internal politics, I'd also wager that the hacker would probably win any external battles.

If your company does not have policies and procedures, assign someone to create and maintain them. If no one on your staff seems up to the challenge, bring in a hired gun for the assignment. Better yet, hire someone to teach your people how it's done. At the least, buy a book on the subject to provide some frame of reference for starting.

Once the policies are written, make sure they are kept current. Outdated policies are worse than useless because they give the appearance of security where none exists.


