4.5 Common Encryption Algorithms

This section describes some of the most common encryption and hashing algorithms that you will come across in evaluating security products. While the list is not exhaustive, it provides a good basis for discussing and comparing features.

4.5.1 Symmetric/Secret Key Algorithms

Algorithms in this category are used for the actual encryption of bulk data, whether it is data stored on a hard drive or data sent over a VPN. They all share fairly short key lengths compared to public key algorithms, but for most applications the keys should remain secure for years to come.

4.5.1.1 Data Encryption Standard (DES).

One of the first encryption algorithms made publicly available for general use, DES was a derivative of an encryption algorithm that IBM had been circulating, known as Lucifer. DES is a block cipher, meaning that it breaks the data into blocks of 64 bits and then encrypts each block separately. DES can operate in several modes, with each mode treating the blocks of data to be encrypted differently. For example, in the cipher block chaining (CBC) mode of DES, the ciphertext of each block is fed into the encryption of the block that follows it. In a sense, each prior block serves as a randomizing agent for the encryption of the next block. This method of operation makes it more difficult for someone to analyze the ciphertext and look for patterns that could reveal the key. It also means, however, that if a block of ciphertext is lost or corrupted, that block and the block that follows it cannot be recovered correctly, even if the key is known.

In contrast, the electronic code book (ECB) mode of DES operation treats each block of 64 bits as a separate entity. Thus, block 2 is not dependent on block 1 for keying material. A lost block will not render the rest of the data unrecoverable. This is the fastest form of DES; but because it makes it easier for someone to capture and analyze ciphertext, it is considered the least secure.
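
As a concrete illustration of the difference between these two modes, here is a minimal sketch in Python, assuming the third-party PyCryptodome library; any DES implementation would show the same effect.

    # Sketch: ECB leaks repeated-block patterns, CBC hides them.
    # Assumes the PyCryptodome library (pip install pycryptodome).
    from Crypto.Cipher import DES
    from Crypto.Random import get_random_bytes

    key = get_random_bytes(8)          # 64-bit DES key (56 effective bits)
    plaintext = b"SAMEBLOK" * 4        # four identical 64-bit blocks

    # ECB encrypts each block independently: identical plaintext blocks
    # produce identical ciphertext blocks, a pattern an attacker can spot.
    ecb = DES.new(key, DES.MODE_ECB).encrypt(plaintext)
    print(len(set(ecb[i:i+8] for i in range(0, len(ecb), 8))))  # 1 unique block

    # CBC feeds each ciphertext block into the encryption of the next,
    # so the same four plaintext blocks yield four different ciphertext blocks.
    iv = get_random_bytes(8)
    cbc = DES.new(key, DES.MODE_CBC, iv=iv).encrypt(plaintext)
    print(len(set(cbc[i:i+8] for i in range(0, len(cbc), 8))))  # 4 unique blocks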

DES uses keys that are 64 bits long; but because eight of those bits are parity bits, used only to ensure that the key itself does not contain undetected errors, the effective length of the key is only 56 bits. While DES is fast, it has been broken using commercial-grade computers in a "reasonable" period of time, as short as three days. As computing power increased, it became clear that DES was less secure than it once was. To respond to this vulnerability, a new encryption algorithm known as Triple DES (3DES) was developed.

Despite having been cracked through brute-force attacks, DES is still found quite often in implementations. Manufacturers in the United States are prohibited from exporting strong encryption technology and thus, by default, do not include 3DES in all products. DES, on the other hand, can be considered the common denominator of encryption. It is widespread and can be found in virtually all cryptographic implementations.

4.5.1.2 Triple Data Encryption Standard (3DES).

Because DES lost some of its strength as computers became more powerful, there was a movement afoot to find stronger encryption algorithms. In this regard, there were two lines of thought. The first was to simply create new algorithms. The other was to use the intellectual investment in DES to create a stronger version of it. The result of that second effort was Triple DES. As the name suggests, 3DES is simply DES applied three times, with three keys, to a single block of data. Any increase in processing time was offset by the greatly increased processing power of encryption devices since the first implementation of DES. The key length of 3DES is 168 bits (56 bits × 3), although meet-in-the-middle attacks reduce its effective strength to roughly 112 bits. Like DES, 3DES is very widespread, and any device claiming to support "strong" encryption will most likely support 3DES as the default option. Because it does classify as strong encryption under U.S. law, U.S. companies are not automatically allowed to export this technology. In some cases, products are sold with DES encryption only, and 3DES must be added on after completing a short form promising that you will not export the technology to enemies of the United States.
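
To see that 3DES really is just DES applied three times, here is a small sketch, again assuming PyCryptodome, that reproduces the standard encrypt-decrypt-encrypt (EDE) construction from three single-DES operations.

    # Sketch (PyCryptodome assumed): 3DES as three chained DES operations.
    from Crypto.Cipher import DES, DES3
    from Crypto.Random import get_random_bytes

    k1, k2, k3 = (get_random_bytes(8) for _ in range(3))
    block = b"8BYTEMSG"                               # one 64-bit block

    # The library's 3DES with the three keys concatenated.
    triple = DES3.new(k1 + k2 + k3, DES3.MODE_ECB).encrypt(block)

    # The same EDE construction built by hand from single DES.
    step = DES.new(k1, DES.MODE_ECB).encrypt(block)   # encrypt with key 1
    step = DES.new(k2, DES.MODE_ECB).decrypt(step)    # decrypt with key 2
    step = DES.new(k3, DES.MODE_ECB).encrypt(step)    # encrypt with key 3

    assert step == triple                             # identical ciphertext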

4.5.1.3 International Data Encryption Algorithm (IDEA).

3DES was not the only suggested replacement for the DES algorithm. As can be imagined, any number of candidates vied for this position. IDEA is one of the more popular of the suggested replacements for DES. Like DES, IDEA encrypts data in block lengths of 64 bits. IDEA, however, uses 128-bit keys. IDEA also employs techniques to "confuse and diffuse" encrypted information. While the implementation of these techniques is interesting in and of itself, it also provides a good background for introducing some of the problems that cryptographers run into when creating an encryption algorithm.

There are essentially three ways to break an encryption algorithm. The first is to simply try all the possible keys. This is countered by making the key space sufficiently large. The second possibility is to discover some sort of flaw in the encryption algorithm itself that allows the key to be calculated more easily than through a brute-force attack. This is why peer-reviewed algorithms are always much more secure than "secret" encryption algorithms. If you have made a mistake in your new "SuperSecure" algorithm, someone will point it out and allow you to fix it. Finally, the third possibility is that an attacker can attempt to determine the key used to encrypt data by performing cryptanalysis of the ciphertext.

There are several different forms of cryptanalysis. Some attempt to look for repeating patterns in the encrypted information. For example, if you were encrypting TCP traffic, you would at least know the organization of the headers and could take a good guess at the information in them. In other attacks, the cryptanalyst encrypts a known piece of plaintext and views the encrypted output. Knowing what to look for may provide clues as to how to decipher everything else that has been encrypted. There are more variations, but all of them depend on the statistical relationship between the ciphertext and the plaintext. That relationship, of course, is the key.

To prevent this, the IDEA cipher attempts to "confuse" by using three separate encryption mechanisms on the same data. The goal is to scramble the original data enough so that it is difficult to even determine what the original was. IDEA attempts to "diffuse" by ensuring that each bit of ciphertext is influenced by the preceding plaintext. Thus, a single bit of plaintext influences more than just a single bit of ciphertext in the same way that the previous block of DES would influence the encryption of the following blocks. Any structure to the plaintext that would aid a cryptanalyst would hopefully be eliminated through this mechanism.

IDEA is a popular algorithm, if for no other reason than that it was included in the PGP product that many use to encrypt e-mail correspondence and personal files. While generally well regarded, it is not as widespread as DES and 3DES. In other words, you cannot count on finding it as part of an encryption solution.

4.5.1.4 Blowfish.

You will not be a student of cryptography or information security for long before the works of this algorithm's author become very familiar. Bruce Schneier wrote Blowfish with the intent of creating a secure, compact, and fast encryption algorithm. Due to its fairly small code base, Blowfish is easy to review, and to this point it has not been compromised. Blowfish has several characteristics going for it. To start, the goal of making Blowfish small and fast seems to have been met: Blowfish can run in only 5K of memory and requires few clock cycles on 32-bit processors, making it ideal for software implementations. To future-proof the algorithm, Blowfish also supports a variable-length key of up to 448 bits.
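
A quick sketch of that variable-length key in practice, again assuming PyCryptodome's Blowfish module, which accepts keys from 32 up to 448 bits:

    # Sketch (PyCryptodome assumed): Blowfish accepts a range of key sizes.
    from Crypto.Cipher import Blowfish
    from Crypto.Random import get_random_bytes

    for key_bits in (32, 128, 448):
        key = get_random_bytes(key_bits // 8)
        cipher = Blowfish.new(key, Blowfish.MODE_ECB)
        ct = cipher.encrypt(b"8BYTEMSG")      # Blowfish uses 64-bit blocks
        print(key_bits, ct.hex())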

While most hardware encryption chips use either DES or 3DES, Blowfish is ideal for software-based encryption implementations. SSH, for example, is essentially an encrypted replacement for the Telnet protocol; because it provides its own encryption services in software, it is often configured to use Blowfish. I know of more than one software-based VPN that uses the Linux FreeS/WAN VPN server to create an efficient and powerful VPN based on the Blowfish algorithm.

4.5.1.5 Advanced Encryption Standard (AES).

When the U.S. Government decided that it was time for a new encryption algorithm to be used by government facilities, a contest was held. From five finalists, the winning algorithm, known as "Rijndael," [5] was selected. In determining the winner, the National Institute of Standards and Technology (NIST) judged the finalists on their suitability for both hardware and software implementations, memory requirements, performance, and, of course, security. AES is a block cipher, as are most symmetric algorithms in common use; but unlike the algorithms discussed thus far, AES operates on blocks of 128 bits. To accommodate varying security/performance considerations, AES offers key lengths of 128, 192, and 256 bits.
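
A minimal sketch of these parameters, once more assuming PyCryptodome: the block size stays fixed at 128 bits regardless of which of the three key lengths is chosen.

    # Sketch (PyCryptodome assumed): AES key sizes vs. its fixed block size.
    from Crypto.Cipher import AES
    from Crypto.Random import get_random_bytes

    block = b"Sixteen byte blk"               # exactly one 128-bit block
    for key_bits in (128, 192, 256):
        key = get_random_bytes(key_bits // 8)
        ct = AES.new(key, AES.MODE_ECB).encrypt(block)
        print(key_bits, len(ct) * 8)          # ciphertext is still 128 bits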

Due to its acceptance by the U.S. Government as part of its Federal Information Processing Standards (FIPS), AES is guaranteed to be a very popular encryption algorithm. Many nongovernment organizations and non-U.S. governments will also adopt AES, simply due to NIST's review of the algorithm.

When considering a new VPN implementation based on hardware encryption, AES is generally considered the go-forward algorithm of choice. Care must be exercised, however, because the relative newness of AES means that support cannot be assured in all encryption products. Support considerations are especially relevant if integration with legacy equipment must be considered.

4.5.2 Asymmetric/Public Key Algorithms

Unlike the symmetric key algorithms, asymmetric algorithms use a considerably larger key space. As described above, their primary purpose is authentication and the encryption of symmetric keys.

4.5.2.1 Diffie-Hellman.

It would not be a stretch to say that modern cryptography, and the Internet boom that followed, rest on the principle of secure communication and would have languished were it not for the work of the researchers Whitfield Diffie and Martin Hellman.

Consider this scenario. You and I are in a crowded room and we wish to begin speaking in code. To do this we have to verbally agree on the key that we will use to encode and decode information. Because we are limited to verbal communication, however, there is the chance that anyone in the room could hear our key exchange and thus know our code. What we need is some way to exchange keys while everyone listens.

The algorithm discovered by Diffie and Hellman accomplishes just this feat. Using algebraic equations that your average ninth grader could understand, Diffie-Hellman provides a way to exchange keys while assuming that anyone else may be listening.

You do not need to know how this is accomplished to make a decision regarding Diffie-Hellman as a countermeasure; but because it is so simple, so elegant, and so important to the science of cryptography, it is worth briefly examining the protocol.

The operation of the protocol is based on modular mathematics. Anyone who remembers learning division, and learning that 16/6 equals 2 with a remainder of 4, knows what modulo arithmetic is. In this case, the divisor (6) is the modulus and 4 is the remainder, so 16 ≡ 4 (mod 6) is an equivalent statement. It essentially says that 6 goes into 16 some number of times with a remainder of 4; how many times is irrelevant to the problem. The interesting thing about the remainder is that there are an infinite number of problems that will all return an answer of 4, so knowing it is not going to help you figure out what numbers were used to generate it. For example, 29 ≡ 4 (mod 5), and 6148 ≡ 4 (mod 1024) as well.
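
These examples can be checked directly with Python's modulo operator:

    # The remainder examples from the text. Many different dividends produce
    # the same remainder, which is exactly why the remainder alone reveals
    # nothing about the numbers used to generate it.
    print(16 % 6)       # 4
    print(29 % 5)       # 4
    print(6148 % 1024)  # 4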

Essentially, the two computers are each sending a remainder to the other and deducing the same secret key from it. To use this technique, two computers publicly agree on a value p and a value q. There are some rules for these numbers: for example, p needs to be a very large prime number, and q must be a value lower than p. One of the two computers then privately chooses a value x, and the other privately chooses a value y. The only requirement for x and y is that they are less than p. With its private number in hand, each computer raises q to its private value and takes the remainder modulo p; that is, one sends q^x mod p and the other sends q^y mod p over the public network. Each then raises the number it received to its own private value, again modulo p, and because (q^y)^x mod p equals (q^x)^y mod p, both arrive at the same shared secret key. Although everyone may see the values exchanged and know p and q, they will have a very difficult time (read, thousands of years) figuring out what the secret key is, because only the computer that generated the x or y value has the information needed to solve the problem. Because these private values are never shared, the secret key is secure. Remember: with modular arithmetic, there are many different ways to come up with the same remainder.

Exhibit 1 shows this operation using sample numbers; a worked sketch follows it below. To keep the problem simple, I am going to use really small numbers. If a computer were doing this, rather than myself with a calculator, the numbers would be 768, 1024, or 1536 bits long. Those are really big numbers.

In modern implementations, Diffie-Hellman is widespread. It is commonly used in key exchanges and for transferring authentication information. This is important to note: Diffie-Hellman alone has no mechanism for authenticating the remote host, and because of this, the protocol is susceptible to man-in-the-middle-style attacks if an attacker can fool a user into relaying his connection through the attacker's computer. The user creates an encrypted channel to the attacker, and the attacker creates a connection to the remote host on behalf of the user. Everything that the user sends is then decrypted, read, and retransmitted by the attacker in the middle of the connection.

Exhibit 1: A "Simple" Diffie-Hellman Example

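Because the figure for Exhibit 1 is not reproduced here, the following Python sketch walks through the exchange with small illustrative numbers of my own choosing, not the original exhibit's values. The variable names follow the text: p and q are public; x and y are private.

    # Sketch of a Diffie-Hellman exchange with tiny illustrative numbers.
    p, q = 23, 5                 # public: p is prime, q is less than p

    x = 6                        # private value chosen by computer A
    y = 15                       # private value chosen by computer B

    A = pow(q, x, p)             # A sends q^x mod p over the public network
    B = pow(q, y, p)             # B sends q^y mod p over the public network

    secret_a = pow(B, x, p)      # A computes (q^y)^x mod p
    secret_b = pow(A, y, p)      # B computes (q^x)^y mod p
    assert secret_a == secret_b  # both arrive at the same shared secret
    print(secret_a)              # 2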

While Diffie-Hellman forms the basis of asymmetric cryptography, it is commonly implemented only in situations where other protocols are available for authentication. The best-known use of Diffie-Hellman is in the IKE key exchange protocol used to set up IPSec sessions. Authentication is provided in a number of ways, all discussed in the IPSec section of this text. Suffice it to say that Diffie-Hellman is used to encrypt the information used for authentication. Another popular use is in the SSL protocol, which is used to secure Web sessions and other transport layer connections between hosts. In this case, Diffie-Hellman is used to encrypt the transfer of RSA or DSS signatures to verify the remote party's identity.

As noted, common Diffie-Hellman exchanges are based on 768-, 1024-, or 1536-bit keys. Longer keys are, of course, more secure, but also consume more processing power. Because most Diffie-Hellman exchanges are only used for short authentication sessions, their duration is short and the key strength is currently adequate, given the computing power likely to be brought against the algorithm.

4.5.2.2 Rivest-Shamir-Adleman (RSA).

Building on the possibilities of the Diffie-Hellman algorithm, researchers Ron Rivest, Adi Shamir, and Len Adleman created the RSA algorithm that has since become the most commonly implemented generic approach to public key encryption. Because it is so common, the "generic" description of public key applications that I described above is also the generic description of the RSA algorithm and its applications.

Some of the biggest news concerning RSA has been the licensing of the algorithm. As the de facto standard for public key encryption, RSA Security was entitled to charge licensing fees to the thousands of companies that used its technology. In 2000, two weeks earlier than expected, the RSA algorithm was released into the public domain. Some felt that the control RSA had over data encryption slowed down innovation and the adoption of encryption technology. Others noted that the restrictions placed upon RSA actually spurred the creation of new technologies that could be more widely implemented.

Regardless, RSA has been around since 1978 and continues to be a widely implemented and trusted encryption algorithm for public key applications. More likely than not, your public key solutions are going to include RSA in some way.

Like Diffie-Hellman, the actual math behind RSA is not that complicated. The key to the security of the algorithm is the difficulty of factoring large numbers. Two large prime numbers, represented as p and q, are chosen. They are multiplied together to create another number, represented as n; n then becomes the modulus that is transmitted as part of the key. Recall from our discussion of Diffie-Hellman that knowing the modulus does not really help you find out which numbers were used to create it.

From p and q (specifically, from the product (p - 1)(q - 1)) we also create two values, e and d. The value e becomes our public key and d becomes the private key. If you wanted to send something to me using RSA encryption, I would send you both e (the public key) and n. You would use those values to encrypt information to me. Because you do not know what d value I have, you cannot decrypt any information that has been encrypted with e (the public key).

Of course, saying that the algorithm is not that complicated and then throwing letters out like that does not make the argument very convincing. Suffice it to say that these cryptographic keys are nothing more than some very big numbers with certain characteristics.
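
To make those letters concrete, here is a worked toy example in Python; the numbers are chosen purely for illustration and are far too small to be secure.

    # Toy RSA example with tiny primes (requires Python 3.8+ for pow(e, -1, m)).
    p, q = 61, 53
    n = p * q                      # n = 3233, the public modulus
    phi = (p - 1) * (q - 1)        # 3120, used to derive e and d
    e = 17                         # public exponent, shared freely
    d = pow(e, -1, phi)            # 2753, the private exponent, kept secret

    message = 65
    ciphertext = pow(message, e, n)    # anyone with (e, n) can encrypt
    recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
    assert recovered == message
    print(ciphertext, recovered)       # 2790 65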

Due in part to its popularity, RSA has been the focus of several dedicated attempts at cracking the encryption. Unlike secret key encryption, where the attack is a brute-force search of the key space, the most straightforward way of cracking RSA is to attempt to factor the modulus back into the prime numbers that generated the keys. Because we know that RSA is created not from arbitrary random numbers but from random prime numbers, the search is narrowed quite a bit. The keys are still very difficult to break; but given enough time and computing power, the prime numbers that created our keys can be determined. Such attacks generally require specialized algorithms and distributed processing. Because the algorithms are constantly being tweaked and improved, and as computing power becomes cheaper and more common, the strength of RSA keys is slowly eroding.
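
Continuing the toy example above: because that modulus is tiny, simple trial division recovers p and q from the public key alone, and with them the private key. Real 1024-bit and larger moduli resist exactly this attack.

    # Sketch of the factoring attack, workable only because n is tiny.
    n, e = 3233, 17                                   # the public key from above

    p = next(i for i in range(2, n) if n % i == 0)    # trial division finds 53
    q = n // p                                        # 61
    d = pow(e, -1, (p - 1) * (q - 1))                 # private key recovered: 2753
    print(p, q, d)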

This is not to suggest that RSA is "insecure." Although researchers have published ways to quickly factor 1024-bit keys, these solutions are still unproven. They would also cost more than most companies (but not governments) would be willing to spend, at around $1 billion per "RSA breaker." Even at this rate, 2048- and 4096-bit keys are quite secure for the foreseeable future. Weaker keys should be avoided in implementation.

[5] Pronounced alternately as "rain doll" or "Reign Dahl."



