15.4 Operating Securely


In general, the longer a computer is used, the less secure it becomes. New software gets installed, increasing the complexity of the system and increasing the chance that a vulnerability will be introduced. Sometimes a security measure might be disabled by an administrator who is looking to get something done quickly. Meanwhile, vulnerabilities in the existing system are more likely to be discovered. Operating systems that were believed to be absolutely secure one day can be fatally vulnerable the next.

Thus, if you spend the time and money to deploy a secure system, but you do not maintain the system, you are wasting your resources. Organizations that hire security consultants are often the most guilty offenders: these organizations frequently bring in some high-powered consultants for a brief engagement. The consultants write a report and leave. Even if the consultants actually succeeded in making the systems more secure, the beneficial results are only temporary.

15.4.1 Keep Abreast of New Vulnerabilities

In today's environment, you must stay abreast of newly discovered vulnerabilities if you wish to maintain a secure computer that is connected to the Internet. The day has long passed when security vulnerabilities were kept quiet. These days vulnerabilities are usually publicized with breathtaking speed once they are discovered. What's more, once a vulnerability is known, exploits are quickly developed and distributed across the Internet. In many cases, system administrators only have a few hours between the time that a vulnerability is first publicized and the time when they will start to be attacked with it.

One way that you can minimize the impact of newly discovered vulnerabilities is by isolating your web server from the Internet using a firewall or IP filtering (see Section 15.6 later in this chapter). But this isolation is not a cure-all, because vulnerabilities have also been discovered in the hardware or software that you use for the isolation. Also, some flaws exploit protocols you need to allow through your firewall. There is no substitute for vigilance.

15.4.2 Logging

Many of the services on networked computers can be configured to keep a log of their activities. Computers that run the Unix and NT operating systems can have their logging systems customized so that events are written into a single file, written to multiple files, or sent over the network to another computer, a printer, or another device.

Logs are invaluable when recovering from a security-related incident. Often they will tell you how an attacker broke in and even give you clues to track down who the attacker is. Log files may be submitted as evidence to a court of law if they are kept on a regular basis as part of normal business practice.

You should have logging enabled for all of your servers and you should be sure that these logs are examined on a regular basis. You may wish to write a small script that scans through the log file on a daily basis and filters out well-known events that you are expecting, or use a log analyzer, as mentioned earlier. The events that are left will be, by definition, the events that you are not expecting. Once you have a list of these events, you can either go back to modify your scripts to suppress them, or you can make phone calls to figure out why they have occurred.
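A minimal version of such a filtering script can be sketched in Python. The patterns here are assumptions for illustration; you would extend the list as you identify benign messages in your own environment:

```python
import re

# Patterns for routine, expected events (assumed examples); anything
# matching one of these is suppressed from the daily report.
EXPECTED = [
    re.compile(r"sshd\[\d+\]: Connection closed"),
    re.compile(r"CRON\[\d+\]: "),
    re.compile(r"syslogd.*: restart"),
]

def unexpected_lines(lines):
    """Return only the log lines that match none of the known patterns."""
    return [line for line in lines
            if not any(p.search(line) for p in EXPECTED)]
```

Run it from cron over the previous day's log (for example, `unexpected_lines(open("/var/log/messages"))`) and mail yourself the output; anything it reports is, by definition, an event you were not expecting.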

Although some programs maintain their own log files, most programs on both Unix and Windows now use the system-wide logging facilities that these operating systems provide. A notable exception to this rule is web servers.

Log files are also useful for gauging the capacity of your system. For example, you might consider logging all of the following parameters on a regular basis. They will not only help you spot security violations, but also help you determine when your systems need to be upgraded:

  • Utilization of your external network connection

  • Utilization of your internal network

  • CPU load of your servers

  • Disk utilization
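A periodic cron job can capture some of these numbers using nothing but the standard library. The sketch below records disk utilization; the paths logged are an assumption:

```python
import shutil
import time

def disk_report(paths):
    """Return (path, percent_used) tuples for each filesystem path."""
    report = []
    for path in paths:
        usage = shutil.disk_usage(path)
        percent = 100.0 * usage.used / usage.total
        report.append((path, round(percent, 1)))
    return report

# Example: emit a timestamped report line for the root filesystem.
for path, pct in disk_report(["/"]):
    print(f"{time.strftime('%Y-%m-%d %H:%M')} {path} {pct}% used")
```

Appending these lines to a file over weeks gives you the trend data needed to plan upgrades, as well as a baseline against which sudden anomalies stand out.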

See the discussion of the privacy implications of log files in Chapter 8.

15.4.2.1 Setting up a log server

If a person breaks into your computer, the first thing that they will do is to cover their tracks by either erasing or subtly modifying your log files. As logs are normally kept on the computer that has been compromised, they are vulnerable. The only way to protect your logs is to set up a secured log server that will collect log entries from other computers on your network.

A log server is a single, secure computer that you set up to receive log messages from other computers on your network. This log server can be either inside or outside your firewall; you may even wish to set up two log servers, one on each side.

Your log server should be a computer system that offers no services to the network and does not have any user accounts. The idea is to make the log server the most secure computer on your network: even if an attacker breaks into all of the other computers on your network, you want the log server to remain secure.[9] Log servers should be used in addition to local logging, not as a substitute.

[9] The previous edition of this book noted that an alternative to setting up a single-purpose log server would be to have a form of write-only log storage such as an obsolete PC connected to your log host using a serial cable. This computer could simply record the log entries in a disk file and display them on a console. In the event that the log server was broken into, you would still have a record of log events on the PC. Alas, these kinds of home-brew logging systems are now simply too complex for most organizations to manage, although they still work well.

15.4.2.2 Logging on Unix

The modern Unix logging system assigns a facility and a priority to each log message. The facility specifies which part of the Unix system generated the logging message. Typical facilities are kern for the kernel, news for the Netnews system, auth for the authentication system, etc. Priorities indicate the level of the message's severity: info for an informative message, alert for alerts, crit for critical messages, and so on.

The Unix file /etc/syslog.conf specifies what the operating system should do with logging messages. Messages can be appended to a file, sent using interactive messages to users who are currently logged on, transmitted to another machine, sent to a program of your own design, or some combination of all of these.
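For example, a few /etc/syslog.conf entries might look like the following (the hostname `loghost` is an assumption; on many systems the fields must be separated by tabs):

```
# Kernel messages of priority crit and above go to the console.
kern.crit                       /dev/console

# Authentication messages are appended to a local file...
auth.info                       /var/log/authlog

# ...and every message of priority info or above is also forwarded
# to the central log server over the network.
*.info                          @loghost
```

The last line is what makes the log-server scheme described above work: even if an intruder scrubs the local files, the copies already sent to `loghost` are out of reach.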

Unix log files need to be pruned on a regular basis, or else they will overwhelm your computer's disk. One way to prune log files is to rotate them. Typical rotation procedures involve copying existing log files to new locations, compressing the files, and storing a few copies. The Unix command newsyslog automates this process. It is controlled by the configuration file /etc/newsyslog.conf.
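A typical /etc/newsyslog.conf entry might look like this; the specific values are assumptions (fields are the log file name, owner and group, mode, number of archived copies to keep, size limit in kilobytes, rotation interval, and flags):

```
# logfilename        [owner:group]  mode  count  size  when  flags
/var/log/messages    root:wheel     644   7      1000  *     Z
/var/log/authlog     root:wheel     600   7      *     168   Z
```

Here the `Z` flag compresses rotated files; the first entry rotates whenever the file exceeds 1000 KB, while the second rotates every 168 hours (weekly) regardless of size.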

15.4.2.3 Logging on Windows 2000

Logging on Microsoft Windows 2000 systems is controlled by the logging service. Auditing is disabled by default on Windows 2000 Professional and on some versions of Windows 2000 Server. You should enable auditing to catch bad login attempts and to watch the IP services you are offering. Be careful, though: you can generate a lot of information very quickly. Logs are kept locally, and should be reviewed on a regular basis. Windows log files are automatically pruned on a regular basis by the logging service. To enable auditing, run the Local Security Policy application in the Administrative Tools folder of the system control panel. Click on Local Policies, then Audit Policy; then double-click on the policy that you wish to change (see Figure 15-1).

Figure 15-1. Enable auditing from the Local Security Policy application.
figs/wsc2_1501.gif

To view the contents of the log, run the Event Viewer application. You can change the retention time of log events by selecting the Properties item from the Action menu (see Figure 15-2).

Figure 15-2. Run the Event Viewer application to view the contents of the log.
figs/wsc2_1502.gif

15.4.3 Backups

A backup is simply a copy of data that is written to tape or other long-term storage media. Computer users are routinely admonished to back up their work on a regular basis. Site administrators can be responsible for backups of dozens or hundreds of machines. Chapter 11 discusses backups in some detail.

Backups serve many important roles in web security:

  • They protect you from equipment failures.

  • They protect you from accidental file deletions.

  • They protect you from break-ins, because files that are deleted or modified by an attacker can be restored from a backup.

  • They allow you to determine the extent of an attacker's damage, because you can detect changes in your system by comparing its files with the files stored on your backup tapes.

Backup systems are not without their problems, however:

  • You must verify that the data on the backup tape is intact and can actually be used to restore a working system. Otherwise, your backup may lull you into a false sense of security.

  • Look closely at systems that back up several computers across a local area network. These systems frequently give the computer that is running the backup server considerable control over the computers that are running backup clients. If the computer that initiates the backups is broken into by an attacker, then any system it backs up may be compromised as well.

  • Check whether the files that are sent over the local area network are encrypted or transmitted in the clear. If they are transmitted without encryption, then an attacker who has access to your network traffic could learn the contents of your systems simply by monitoring the backup.

  • Backup media, by definition, contain all of your files. Backup tapes and disks should be protected at least as well as your computers. You may also wish to consider encrypting your backup media, so that in the event that it is stolen, your data will not be compromised.

  • Be careful with access control lists (ACLs) in an NT environment. Nonadministrative users who have rights to perform backups also have the ability to examine any file in the filesystem. Furthermore, if they can restore files, they also have the ability to substitute personalized versions of user database and registry information.

Fortunately, the risks associated with backups can be managed. Make backups of your system on a regular basis. These backups should be stored both on-site and off-site, and they should be guarded to protect the information they contain.

15.4.4 Using Security Tools

A security tool is a special program that you can run to evaluate or enhance the security of your site. Many security tools that are available today were developed at universities or by independent specialists and are freely distributed over the Internet. There are also several good tools that are marketed commercially.

There are five kinds of tools that you should consider using:

  • Tools that take a snapshot of your system and look for potential weaknesses

  • Tools that monitor your system over time, looking for unauthorized changes

  • Tools that scan your network, looking for network-based weaknesses

  • Tools that monitor your system and network to identify attacks in progress, including attacks by malware

  • Tools that record all network activity for later analysis

Automated tools are (usually) a low-cost, highly effective way to monitor and improve your system's security. Some of these tools are also routinely employed by attackers to find weaknesses in sites around the Internet. Therefore, it behooves you to obtain your own tools and use them on a regular basis.

15.4.4.1 Snapshot tools

A snapshot or static audit tool will scan your system for weaknesses and report them to you. For example, on your Unix system a tool might look at the /etc/passwd file to ensure that it is not writeable by anyone other than the superuser. Snapshot tools perform many (perhaps hundreds) of checks in a short amount of time. One of the best-known programs and the first generally available was developed for Unix: COPS, written by Dan Farmer with assistance and supervision by Gene Spafford. Unfortunately, COPS is now several years out of date and has not been updated in a long time.
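The core of such a check is small. This Python sketch tests whether a file can be written by anyone other than its owner; the list of files to examine is an assumption:

```python
import os
import stat

def world_or_group_writable(path):
    """Return True if the file's group or 'other' write bits are set."""
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IWGRP | stat.S_IWOTH))

# A few files a Unix snapshot tool would typically examine (assumed list).
for path in ("/etc/passwd", "/etc/hosts"):
    if world_or_group_writable(path):
        print(f"WARNING: {path} is writable by group or world")
```

A real snapshot tool runs hundreds of checks of this general shape: stat a file, compare against policy, and report the deviation.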

A more up-to-date Unix snapshot tool is Tiger, from Texas A&M University. Tiger runs on a wider variety of operating systems and is easy to install. Tiger performs nearly all of the checks that are in COPS, plus many more. Unfortunately, it produces a report that can be very difficult to interpret because of its length. It also is unclear if the tool is being maintained. You can find the most recent version at http://www.net.tamu.edu/ftp/security/TAMU/.

Several packages are available in the Windows world. The Kane Security Analyst (KSA) from Intrusion Detection, Inc. (http://www.intrusion.com/) will check passwords and permissions (ACLs), and monitor data integrity. NAT is a free tool for assessing NetBIOS and NT password security made available by Security Advisors (http://www.secnet.com). Two tools for checking NT passwords are ScanNT, written by Andy Baron (http://www.ntsecurity.com/Products/ScanNT/index.htm), and L0phtCrack, by the "computer security researchers" at L0pht Heavy Industries (now part of @Stake).

A snapshot program should be run on a regular basis: no less than once a month, and probably at least once a week. Carefully evaluate the output of these programs, and follow up if possible.

Finally, be careful that you do not leave the output from a snapshot security tool in a place that is accessible to others: by definition, the holes that they can find can easily be exploited by attackers.

15.4.4.2 Change-detecting tools

It's also important to monitor your system on a regular basis for unauthorized changes. That's because one of the first things that an attacker usually does once he breaks in is to modify your system to make it easier to regain access in the future, and/or to hide evidence of the break-in. Scanning for changes won't prevent a break-in but it may alert you to the fact that your system has been compromised. As most break-ins go unnoticed for some time, change-detecting tools may be the only way that you can be alerted to an intruder's presence and take appropriate action.

When more than one person is administering a system, change reports also give users an easy way to keep track of each other's actions.

Some vendors have started to include automated checking systems as part of their base operating system. The BSD/OS operating system, for example, includes an automated tool that runs every night and looks for any changes that have occurred in the system configuration files in the /etc directory. To perform this check, the BSD/OS scripts use comparison copies; that is, they make a copy of every /etc file every night and use the diff command to compare the actual file in /etc with the copy. Called the "daily insecurity report," these reports are automatically mailed to the system manager account. Unfortunately, this system is easy for an experienced attacker to subvert, because the comparison files are kept on the same computer as the original files. For this reason, more sophisticated systems are run from removable media, which is also used to store either the comparison copies of each file or their cryptographic checksums.
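The same comparison can be done more compactly with cryptographic checksums instead of full comparison copies, since only the digests need to fit on removable or read-only media. A minimal sketch of the approach (file paths and the baseline location are assumptions):

```python
import hashlib

def checksum(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def snapshot(paths):
    """Build a baseline: map each path to its current checksum."""
    return {p: checksum(p) for p in paths}

def changed(paths, baseline):
    """Return the paths whose current checksum differs from the baseline."""
    return [p for p in paths if checksum(p) != baseline.get(p)]

# In practice the baseline dictionary would be written once to
# removable media (e.g., a floppy or CD-R) and reloaded each night.
```

Because the baseline lives off the monitored machine, an intruder who modifies a system file cannot also rewrite the stored digest to cover the change.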

Tripwire is another tool that can automatically scan for changes. Originally developed by Gene Kim and Gene Spafford at Purdue University, Tripwire is now commercially maintained and sold by Tripwire Inc. for both Unix and Windows systems; see http://www.tripwire.com/ for details.[10] The program can store the results of a Tripwire scan on removable media or report the results to a central Tripwire management console. An open source version of Tripwire is available for Linux systems. At one time (and perhaps still), various versions of Tripwire constituted the most commonly used intrusion and change detection system in use in the world.

[10] The Tripwire for Web Pages offering is worth mentioning; it notes any changes to pages served by an Apache web server and alerts the administrator. This protects against unauthorized alterations or defacements.

15.4.4.3 Network scanning programs

You should use automated tools to scan your network. These tools check for well-known security-related bugs in network programs such as sendmail and ftpd. Your computers are certainly being scanned by crackers interested in breaking into your systems, so you might as well run these programs yourself. Here are several we recommend:

  • One of the most widely publicized scanning programs of all time was SATAN, written by Dan Farmer and Wietse Venema. Before its release in April 1995, SATAN was the subject of many front page stories in both the trade and general press. SATAN derives its power from the fact that it can scan a group of machines on a local or remote network. You can use it to check the security policy for a group of machines that you are responsible for, or you can point it beyond your firewall and see how well your counterpart at a competing business is doing. SATAN has an easy-to-use interface[11] and a modular architecture that makes it possible to add new features to the program. Nevertheless, SATAN is not that powerful: the program scans only for very well-known vulnerabilities, many of which have been fixed in recent years. It is no longer maintained, and we're listing it here mainly for historical purposes.

    [11] SATAN was actually one of the first programs to be administered from a web browser.

  • Several companies sell commercial scanners. One of the most widely known of these companies is Internet Security Systems, Inc., which commercialized the freeware scanner by the same name. Scanners are also sold by Axent, Network Associates, and others. You can learn more about these programs from their vendors' respective web sites.

  • SomarSoft (http://www.somarsoft.com) offers several tools for analyzing information culled from Windows NT logs and databases. KSA, mentioned under Section 15.4.4.1, also provides analysis and integrity checking for NT environments. Likewise, some commercial virus scanning products can provide signature-based integrity checks for NT binaries and data files.

15.4.4.4 Intrusion detection systems

Intrusion detection system (IDS) programs are the operating system equivalent of burglar alarms. As their name implies, these tools scan a computer as it runs, watching for the tell-tale signs of a break-in.

When computer crackers break into a system, they normally make changes to the computer to make it easier for them to break in again in the future. They frequently use the compromised computer as a jumping-off point for further attacks against the organization or against other computers on the Internet. The simplest intrusion detection tools look for these changes by scanning system programs to see if they have been modified.

Intrusion detection systems can either be host based or network based. A host-based system looks for intrusions on that particular host. Most of these programs rely on secure auditing systems built into the operating system. Network-based systems monitor a network for the tell-tale signs of a break-in on another computer. Most of these systems are essentially sophisticated network monitoring systems that use Ethernet interfaces as packet sniffers.

Most working intrusion detection tools are commercial. Some representative systems currently available are:

  • Tripwire, described earlier in this chapter in Section 15.4.4.2.

  • Dragon, marketed by Enterasys, is a highly-rated and powerful intrusion detection system for hosts and networks (http://www.enterasys.com/ids/).

  • Cisco Secure IDS (formerly NetRanger), which uses network monitoring to scan for intrusions (http://www.cisco.com/).

  • Realsecure Manager and Realsecure Network Sensor, by Internet Security Systems (http://www.iss.net/).

  • Shadow, a freeware network monitoring and IDS system created and maintained by the Naval Surface Warfare Center (http://www.nswc.navy.mil/ISSEC/CID/).

15.4.4.5 Virus scanners

There is a huge market for antivirus tools. Several major vendors market a variety of tools to be used on individual hosts, networks, mail servers, and more. Network Associates and Symantec appear to be the major vendors of these tools in North America. The product features and coverage change frequently, so it is difficult to say with certainty what is current and recommended. However, most antivirus tools detect roughly the same viruses in approximately the same manner, so we can almost say that any current version of products by these two companies will work.

Note that antivirus tools are not needed for Unix or Linux systems: there are only three or four reported viruses for these platforms, and they do not spread well. An integrity monitor (such as Tripwire) will also perform any antivirus function needed on these platforms as a side-effect of the way it works.

Mac OS systems do not generally need an antivirus tool unless you are in the habit of using old media that may be contaminated, or you use a Microsoft product with macros enabled (e.g., Word, Excel, or Outlook). Historically, there have only been about 60 viruses discovered for the Mac in the last dozen years. None of these seem to be still in general circulation, although there are undoubtedly reservoirs on old, infected diskettes.

The majority of discovered viruses (more than 72,000 by some counts in late 2001) are for Microsoft software: DOS, Windows, NT, and Visual Basic Scripting. Luckily, many of these are no longer in general circulation or no longer work on current platforms; perhaps only 300-400 are in common circulation. However, some vendors report that new viruses are appearing at a rate of 12-15 per day, the majority written in macro languages. Thus, if you run any of these platforms, you need to have antivirus software in place and updated regularly.

15.4.4.6 Network recording and logging tools

Intrusion detection systems are like sophisticated alarm systems: they have sensors and alarms, and if an intruder happens to trip over one of them, the IDS will record this fact. But the fundamental problem with intrusion detection systems is that they can only record what they have been programmed to notice.

Network recording and logging tools take a different approach. Instead of having well-placed alarms, these systems record all of the traffic that passes over a network, allowing retrospective analysis. These systems are typically run on computers with large disks. (An 80-gigabyte hard disk, for example, can store nearly two weeks of typical traffic sent over a T1 line.) In the event of a break-in or other incident, the recorded traffic can be analyzed.
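The capacity claim is easy to check with back-of-the-envelope arithmetic, assuming a T1 line at 1.544 Mbps and roughly one-third average utilization:

```python
# A T1 line carries 1.544 megabits per second.
t1_bits_per_sec = 1.544e6
bytes_per_day = t1_bits_per_sec / 8 * 86400        # ~16.7 GB/day at saturation

disk_bytes = 80e9                                   # an 80 GB disk
days_at_full_rate = disk_bytes / bytes_per_day      # ~4.8 days
days_at_one_third = days_at_full_rate * 3           # ~14 days of typical traffic

print(round(days_at_full_rate, 1), round(days_at_one_third, 1))  # → 4.8 14.4
```

So a fully saturated T1 would fill the disk in under five days, but at typical utilization the same disk holds about two weeks of traffic, as the text states.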

A variety of recording systems are now available, including:

  • NFR, by NFR Security (http://www.nfr.com/)

  • NetVCR, by NIKSU (http://www.niksun.com/)

  • Silent Runner, by Raytheon (http://www.silentrunner.com/)

  • NetIntercept, by Sandstorm Enterprises (http://www.sandstorm.net/)[12]

    [12] Simson Garfinkel and Gene Spafford are both founders of Sandstorm.



Web Security, Privacy and Commerce, 2nd Edition
ISBN: 0596000456