1.2. Patching

Correcting problems such as default or empty passwords becomes difficult when the correction must be distributed by the vendor. As an example, for many years Microsoft's SQL Server was distributed with an empty password on its administrator account.[12] This changed when the SQLSnake/Spida worm exploited that empty password to gain access to servers running that database. Microsoft issued a patch to update the server and add a password.

[12] CERT, "Microsoft SQL Server and Microsoft Data Engine (MSDE) ship with a null default password," CERT Vulnerability Note VU#635463 (Aug. 10, 2000); http://www.kb.cert.org/vuls/id/635463.

A patch is an update to a program or system, designed either to enhance its functionality or to fix an existing problem. In the context of security, it is the mechanism used to fix a security flaw by updating the system. The patch, embodied in a program or script, is placed on the system to be patched and then executed; executing it updates the system.
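To make the mechanism concrete, the following is a minimal sketch of such a patch script, written in Python. Every path, file name, and checksum in it is a hypothetical placeholder, not any real vendor's patch format; it simply shows a patch as a program that checks its target and then updates it.

    # Sketch of a patch delivered as a script: verify the installed
    # version, save a rollback copy, install the corrected binary.
    # All paths and the checksum are hypothetical placeholders.
    import hashlib
    import shutil
    import sys

    TARGET = "/usr/local/app/server"    # binary the patch replaces
    REPLACEMENT = "./server-fixed"      # corrected binary shipped with the patch
    EXPECTED_SHA256 = "0123..."         # checksum of the vulnerable version

    def main():
        # Apply the patch only to the exact version it was written for;
        # patching the wrong version causes the conflicts discussed below.
        with open(TARGET, "rb") as f:
            installed = hashlib.sha256(f.read()).hexdigest()
        if installed != EXPECTED_SHA256:
            sys.exit("installed version does not match; patch not applied")
        shutil.copy2(TARGET, TARGET + ".orig")   # keep a rollback copy
        shutil.copy2(REPLACEMENT, TARGET)        # install the fix
        print("patch applied; original saved as", TARGET + ".orig")

    if __name__ == "__main__":
        main()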

Ideally, patching should never be necessary. Systems should be correct and secure when delivered. But in practice, even if such systems could be created, their deployment into various environments would mean that the systems would need to be changed to meet the needs of the specific environment in which they are used. So, patching will not go away. However, it should be minimal, and as invisible as possible. Specifically, the principle of psychological acceptability implies that patching systems should require little to no intervention by the system administrator or user.

Unfortunately, several considerations make invisible patching difficult.

The first difficulty is collecting all of the necessary patches. In a homogeneous network, only one vendor's patches need to be gathered, but a single vendor may offer a wide variety of systems, and the patches for one system likely will not apply to another. If the network has systems from many vendors, the problem of gathering and managing the patches is severe. Tools such as Cro-Magnon[13] and management schemes built from several tools[14] attempt to ameliorate this task. All require configuration, maintenance, and knowledgeable system administrators.

[13] Jeremy Bargin and Seth Taplin, "Cro-Magnon: A Patch Hunter-Gatherer," Proceedings of the 13th LISA Conference (Nov. 1999), 87–94.

[14] David Ressman and John Valdés, "Use of Cfengine for Automated, Multi-Platform Software and Patch Distribution," Proceedings of the 14th LISA Conference (Dec. 2000), 207–218.
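A short sketch suggests why the gathering itself is hard: each vendor publishes advisories in its own location and format, so even a simple collector needs per-vendor fetching logic and an accurate inventory of what is deployed where. The vendors, feeds, products, and hosts below are all hypothetical.

    # A sketch of a multi-vendor patch collector.  Every vendor publishes
    # advisories in its own place and format, so each needs its own
    # fetching logic; the feeds, products, and hosts are hypothetical.
    from urllib.request import urlopen

    # Systems on the network, keyed by (vendor, product, version).
    inventory = {
        ("unixco", "os", "5.8"): ["hostA", "hostB"],
        ("dbco", "dbserver", "7.1"): ["dbhost"],
    }

    # One advisory feed per vendor; there is no single standard source.
    feeds = {
        "unixco": "https://patches.unixco.example/advisories.txt",
        "dbco": "https://updates.dbco.example/bulletins.txt",
    }

    def fetch_advisories(vendor):
        """Download a vendor's advisory list, one advisory per line."""
        with urlopen(feeds[vendor]) as resp:
            return resp.read().decode().splitlines()

    for (vendor, product, version), hosts in inventory.items():
        for advisory in fetch_advisories(vendor):
            # A patch for one product and version rarely applies to
            # another, so every advisory is matched against the inventory.
            if product in advisory and version in advisory:
                print(f"{vendor}: patch needed on {hosts}: {advisory}")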

The second difficulty is system-specific conflicts. When vendors write and test a patch, they do so against their current distribution. But customers tailor their systems to meet local needs, and if the tailoring conflicts with the patch, the patch may prevent the system from functioning correctly.

Two examples will demonstrate the problem. In the first, a site runs a version of the Unix operating system with a nonstandard, but secure, mail server program. When a patch for that system is released, the system administrator updates the system programs and then reinstalls them. The entire process is automated, so the system administrator runs two commands: one to update the source code for the system, and one to compile and reinstall all changed programs. But whenever the system's standard mail server is among the programs patched, the system administrator must reinstall the nonstandard mail server, and because of the architecture of the updating process, this requires a separate set of commands. This violates the principle of psychological acceptability, because maintaining the security mechanism (the nonstandard mail server) is a visible process: the system administrator must be aware of the updating process and check whether the standard mail server was updated or reinstalled.
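Nothing in that process warns the administrator when the update silently reinstalls the standard server. A small post-update check, sketched below under the assumption that the site's own mail server lives at a known path with a known checksum (both hypothetical), at least makes the replacement visible:

    # Post-update check: warn if the system update silently replaced the
    # site's nonstandard mail server with the vendor's standard one.
    # The path and the checksum are hypothetical placeholders.
    import hashlib
    import sys

    MAILER = "/usr/sbin/sendmail"   # path where the mail server is installed
    LOCAL_SHA256 = "ab12..."        # checksum of the site's own server binary

    with open(MAILER, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    if digest != LOCAL_SHA256:
        # The update reinstalled the vendor's server; the nonstandard
        # one must be rebuilt and reinstalled by hand.
        sys.exit("WARNING: update replaced the local mail server; reinstall it")
    print("local mail server intact")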

The second example comes from the world of finance. Many large brokerage houses run their own financial software. As the brokerage houses write this software themselves, and use it throughout the world, they must ensure that nothing interferes with these programs. If the programs cease to function, the houses will lose large sums of money because they will not be able to trade on the stock markets or carry out their other financial functions. When a vendor sends them a security patch, the brokerage houses dare not install that patch on their most important production systems because the patch may interfere with their programs. The vendor does not have copies of these programs, and so has no way to test for interference. Instead, the houses install the patch on a test system or network, and determine for themselves if there is a conflict. Again, the process of maintaining a secure system should be invisible to the system administrators, but because of the nature of the system, transparency means a possible violation of the availability aspects of the site's security policy. The conflict seems irreconcilable.
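That test-first discipline is easy to automate in outline. The sketch below copies a patch to a test host, applies it, and runs the in-house regression suite, promoting nothing unless every step succeeds; the hostname, patch file, and test command are hypothetical.

    # Staged patching: apply a vendor patch on a test system, run the
    # in-house regression suite, and promote it to production only if
    # every step succeeds.  Hostname, patch file, and test command are
    # hypothetical.
    import subprocess
    import sys

    TEST_HOST = "trading-test.example.com"
    PATCH = "vendor-patch-001.run"

    steps = [
        ["scp", PATCH, f"{TEST_HOST}:/tmp/"],                 # copy the patch over
        ["ssh", TEST_HOST, f"sh /tmp/{PATCH}"],               # apply it on the test host
        ["ssh", TEST_HOST, "/opt/trading/tests/run_all.sh"],  # run the in-house suite
    ]

    for step in steps:
        if subprocess.run(step).returncode != 0:
            sys.exit(f"halted: step failed, patch not promoted: {step}")
    print("regression suite passed; schedule the production rollout")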

This conflict is exacerbated by automatic downloading and installation of patches. On the surface, doing so makes the patching invisible. If there are no conflicts between the patch and the current configuration, this is true. But if there are conflicts, the user may find a system that does not function as expected, with no clear reason for the failure.

This happened with a recent patch for Microsoft's Windows XP system. Service Pack 2 made many modifications to improve both functionality and security. Therein lay the problem. Among the enhancements was the activation of Windows Firewall, which blocks certain connections from the Internet. As a result, many servers and clients, including IIS, some FTP clients, and many games, no longer functioned correctly. After installing the patch, users had to reconfigure the firewall to allow these programs to work as they had before.[15] These are precisely the problems that the principle of psychological acceptability disallows.

[15] Microsoft Corp., "Some Programs Seem to Stop Working After You Install Windows XP Service Pack 2," Article ID 842242 (Sept. 28, 2004); http://support.microsoft.com/default.aspx?kbid=842242.

In the extreme, a patch may improve security but disable necessary features. In that case, the user must choose between an effective security mechanism and necessary functionality; the security mechanism is as obtrusive as possible, clearly violating the principle of psychological acceptability. The best example is another patch that Microsoft issued to fix a vulnerability in SQL Server. The patch eliminated the vulnerability exploited by the Slammer worm, but under certain conditions interfered with correct SQL Server operation.[16] A subsequent patch fixed the problem.[17]

[16] Microsoft Corp., "Elevation of Privilege in SQL Server Web Tasks (Q316333)," Microsoft Security Bulletin MS02-061 (Oct. 16, 2002); http://www.microsoft.com/technet/security/bulletin/MS02-061.mspx.

[17] Microsoft Corp., "FIX: Handle Leak Occurs in SQL Server When Service or Application Repeatedly Connects and Disconnects with Shared Memory Network Library," Article ID 317748 (Oct. 30, 2002); http://support.microsoft.com/default.aspx?scid=kb;en-us;317748.

The third difficulty with automating the patching process is establishing the trustworthiness of the source. If the patch comes from the vendor and is digitally signed with the vendor's private key, then the contents of the patch are as trustworthy as the vendor is. But some vendors have distributed patches through less secure channels, such as USENET newsgroups or unsigned downloads. Some systems automatically check digital signatures on patches, but many others do not, and faced with the choice, many users will not bother to check either. Unverified or unverifiable patches may contain Trojan horses or other back doors designed to allow attackers entry. This attack was demonstrated when a repository of security-related programs was broken into and the attackers replaced a security program designed to filter network connections with one that let them gain administrator access to any system on which it was installed.[18] Sometimes even signatures cannot be trusted. An attacker claiming to be from Microsoft Corporation tricked Verisign, Inc., into issuing two certificates used to authenticate installers and ActiveX components (though not Windows updates).[19] Although the certificates were cancelled as soon as the hoax was discovered, the attacker could have produced digitally signed fake patches during the interval in which the newly issued certificates remained valid.

[18] CERT, "Trojan Horse Version of TCP Wrappers," CERT Advisory CA-1999-01 (Jan. 21, 1999); http://www.cert.org/advisories/CA-1999-01.html.

[19] CERT, "Unauthentic 'Microsoft Corporation' Certificates," CERT Advisory CA-2001-04 (March 22, 2001); http://www.cert.org/advisories/CA-2001-04.html.
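Where a vendor does sign its patches, checking the signature before installation is straightforward; the failure is that many systems and users simply skip the check. The sketch below uses the third-party Python cryptography package (a modern library, named here only for illustration) and assumes the vendor's RSA public key was obtained earlier over a trusted channel; the file names are hypothetical. As the Verisign incident shows, even a valid signature is only as trustworthy as the process that issued the key.

    # Verify a vendor's detached signature on a patch before installing it.
    # Assumes the third-party "cryptography" package and an RSA public key
    # (vendor_pubkey.pem) obtained earlier over a trusted channel; the
    # patch and signature file names are hypothetical.
    import sys
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding
    from cryptography.hazmat.primitives.serialization import load_pem_public_key

    with open("vendor_pubkey.pem", "rb") as f:
        pubkey = load_pem_public_key(f.read())
    with open("patch.bin", "rb") as f:
        patch = f.read()
    with open("patch.bin.sig", "rb") as f:
        signature = f.read()

    try:
        # Raises InvalidSignature if the patch was altered or the
        # signature was produced with a different key.
        pubkey.verify(signature, patch, padding.PKCS1v15(), hashes.SHA256())
    except InvalidSignature:
        sys.exit("signature check failed; do not install this patch")
    print("signature valid; patch may be installed")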

Finally, the need to tailor techniques for patching to the level of the target audience is amply demonstrated by the problems that home users face. With most vendors, home users must go to the vendors' web sites to learn about patches, or subscribe to an automated patch notification system.[20], [21] Because users rarely take such proactive actions on their own, some vendors are automating the patching mechanisms in an attempt to make these mechanisms invisible to the user. As most home users reconfigure their systems very little, this effort to satisfy the principle of psychological acceptability may work well. However, the technology is really too new for us to draw any reliable conclusions.

[20] CERT, "Continuing Threats to Home Users," CERT Advisory CA-2001-20 (July 20, 2001); http://www.cert.org/advisories/CA-2001-20.html.

[21] CERT, "Home Network Security," CERT Tech Tips (June 22, 2001); http://www.cert.org/tech_tips/home_networks.html.


