Technological Background


Threats Analysis, Security Mechanisms and Security Services

Every security-related activity starts with threats analysis. Although threats analysis may vary from one specific environment to another, the basic approach is as follows (Raepple, 2001). Threats are first identified, and the probability of successful realization of each identified threat is determined. Afterwards, the expected damage is calculated. This is the basis for setting priorities for countermeasures. Investment in countermeasures should certainly not exceed damage costs.
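
As an illustration of this prioritization step, the short Python sketch below ranks a few hypothetical threats by expected damage (probability of successful realization multiplied by the damage caused); all names and figures are invented for the example, and the rule from the text applies: investment in a countermeasure should stay below the expected damage.

    # Hypothetical threats with estimated probabilities of successful realization
    # and damage figures; all numbers are illustrative only.
    threats = [
        {"name": "web-server defacement", "probability": 0.30, "damage": 20_000},
        {"name": "customer data theft",   "probability": 0.05, "damage": 500_000},
        {"name": "e-mail virus outbreak", "probability": 0.60, "damage": 10_000},
    ]

    # Expected damage = probability of successful realization * damage caused.
    for t in threats:
        t["expected_damage"] = t["probability"] * t["damage"]

    # Priorities for countermeasures follow from the expected damage.
    for t in sorted(threats, key=lambda t: t["expected_damage"], reverse=True):
        print(f'{t["name"]:25s} expected damage: {t["expected_damage"]:>10,.0f}')
        # Investment in a countermeasure should not exceed this figure.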

From a technological point of view, threats are countered with security mechanisms and security services (ISO, 1995a). Mechanisms include symmetric cryptographic algorithms, e.g., AES (Foti, 2001), asymmetric cryptographic algorithms, e.g., RSA (RSA Labs, 2002), one-way hash functions, e.g., SHA-1 (Eastlake, 2001), and physical mechanisms. For devices with weak processing capabilities, such as smart cards, elliptic curve based systems should be mentioned, e.g., ECDSA (ANSI, 1998). The advantage of these systems is that they require shorter keys than ordinary asymmetric algorithms for a comparable strength of cryptographic transformation.

With symmetric algorithms, the same key is used for encryption and decryption, while asymmetric algorithms use one key for encryption and another for decryption. The key that is kept secret is called a private key, and the other, which can be communicated to anyone, is called a public key. This is a very desirable property that is the basis for digital signatures—anyone in possession of the corresponding public key can decrypt a message that has been encrypted with a private key, which assures the origin and integrity of this message. But there are drawbacks. In comparison to symmetric algorithms, asymmetric algorithms are computationally more complex. Next, to ensure that a particular public key indeed belongs to a claimed person, a trusted third party called a certification authority (CA) has to be introduced. A CA issues a certificate, which is a digitally signed electronic document that binds an entity to the corresponding public key (certificates can be verified with the CA's public key). The CA also maintains certificate revocation lists (CRLs), which should be checked every time a certificate is processed, in order to ensure that a private/public key pair is still valid. One possible reason for invalidating keys is the growing processing power of computing devices, which prompts the need for ever-longer keys. Further, private keys may become compromised, and finally, a user may be using a certificate in an unacceptable way. This is the point where public key infrastructure comes in.
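
A minimal sketch of the two families of algorithms, assuming a recent version of the pyca/cryptography Python package (the keys and messages are invented; Fernet is an AES-based recipe, and RSA with OAEP padding stands in for the asymmetric case):

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Symmetric: the same key encrypts and decrypts (Fernet wraps AES).
    key = Fernet.generate_key()
    cipher = Fernet(key)
    token = cipher.encrypt(b"order 1234: 100 units")
    assert cipher.decrypt(token) == b"order 1234: 100 units"

    # Asymmetric: one key of the pair encrypts, the other decrypts.
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    ciphertext = private_key.public_key().encrypt(b"confidential", oaep)
    assert private_key.decrypt(ciphertext, oaep) == b"confidential"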

Regarding digital signatures, one should bear in mind that they are actually produced by the use of one-way hash functions, which process a text of arbitrary length into an output of fixed length. One-way hash functions produce a fingerprint of a document, and these fingerprints are used for digital signatures: a document is hashed and its hash value is encrypted with a private key—this constitutes its signature. The recipient produces a hash value of the received document and decrypts the signature with the public key. If the two values match, the document is successfully verified.
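
The following sketch mirrors that hash-then-sign flow with SHA-256 and RSA, again assuming the pyca/cryptography package (the pre-hashed interface is used so that the separate hashing step stays visible; the document text is invented):

    import hashlib

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa, utils

    signer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    document = b"Purchase order no. 1234, 100 units, net 30 days."

    # Sender: hash the document, then sign (encrypt) the digest with the private key.
    digest = hashlib.sha256(document).digest()
    signature = signer_key.sign(digest, padding.PKCS1v15(), utils.Prehashed(hashes.SHA256()))

    # Recipient: hash the received document and check it against the signature
    # with the sender's public key; verify() raises an exception on mismatch.
    received_digest = hashlib.sha256(document).digest()
    signer_key.public_key().verify(
        signature, received_digest, padding.PKCS1v15(), utils.Prehashed(hashes.SHA256())
    )
    print("document successfully verified")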

Protocols that use cryptographic primitives are called cryptographic protocols. They are used to implement the following security services:

  • Authentication, that assures that the peer communicating entity is the one claimed.

  • Confidentiality, that prevents unauthorized disclosure of data.

  • Integrity, which ensures that any modification, insertion or deletion of data is detected.

  • Access control, that enables authorized use of resources.

  • Non-repudiation, which provides proof of origin and proof of delivery, where false denying of the message content is prevented.

  • Auditing, that enables detection of suspicious activities, analysis of successful breaches and serves for evidence, when resolving legal disputes.

Security Infrastructure

Except for auditing, security services are implemented with cryptographic protocols. To provide authentication in a global network, asymmetric algorithms are used because of their low key-management complexity. To compensate for their computational complexity, symmetric algorithms are used for session transfers once entities have been authenticated. The exchange of session keys is done with an asymmetric algorithm during the authentication phase.
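
A compact sketch of this hybrid pattern, assuming the pyca/cryptography package (the server key, the session key and the message below are illustrative):

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    server_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Client: generate a fresh symmetric session key and protect it with the
    # server's public key (the slow asymmetric operation happens only once).
    session_key = Fernet.generate_key()
    wrapped_key = server_key.public_key().encrypt(session_key, oaep)

    # Server: recover the session key with its private key.
    recovered_key = server_key.decrypt(wrapped_key, oaep)

    # Bulk traffic is then protected with the fast symmetric cipher.
    channel = Fernet(recovered_key)
    token = channel.encrypt(b"session data protected by the symmetric algorithm")
    assert Fernet(session_key).decrypt(token).startswith(b"session data")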

To enable the above-described basic procedures for digital signatures and the establishment of secure sessions, a so-called public key infrastructure has to be set up. Besides CAs, a directory is needed for distributing certificates and CRLs—an example of such a directory is the X.500 directory (ITU, 1997). The so-called registration authority (RA), which serves as an interface between a user and the CA, identifies users and submits certificate requests to the CA. In addition, a synchronized time base system is needed for proper operation. All these elements, together with appropriate procedures, form a public key infrastructure (PKI).

The main specification of a certificate and certificate revocation list is the X.509 standard, version 3 (ITU, 2000). Basic certificate fields are the serial number, the issuer (the trusted third party), the subject that is the owner of a public key, the public key itself, the validity period and the signature of the certificate. Other fields are required for processing instructions, while extensions are needed to support other important issues, which are yet to be resolved: automatic CRL retrieval, their placement and distribution, security policy issues, etc. One should note that before using a public key, the certificate always has to be checked against the corresponding CRL (or CRLs).
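
The basic fields can be inspected programmatically; the sketch below assumes the pyca/cryptography package and a hypothetical PEM-encoded certificate file name:

    from cryptography import x509

    # "subject_cert.pem" is a hypothetical file holding a PEM-encoded certificate.
    with open("subject_cert.pem", "rb") as fh:
        cert = x509.load_pem_x509_certificate(fh.read())

    print("serial number:", cert.serial_number)
    print("issuer:       ", cert.issuer.rfc4514_string())
    print("subject:      ", cert.subject.rfc4514_string())
    print("valid from/to:", cert.not_valid_before, "/", cert.not_valid_after)
    print("public key:   ", cert.public_key())
    print("signed with:  ", cert.signature_algorithm_oid)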

A procedure for initial key exchange within the web environment goes as follows. When contacting the RA, a user signs a request and is identified on the basis of a valid formal document. A user who has enrolled at the RA is sent two secret strings through two different channels, e.g., e-mail and ordinary mail. After obtaining these strings, the user connects to the CA's server, which supports the SSL protocol (Freier, 1996) and has the CA's certificate installed. By connecting to this server through the browser, the SSL protocol is automatically activated and a secure session is established. Based on the data about the CA's certificate, the user can be assured of being connected to the appropriate server—usually this is done by checking key fields and a fingerprint of the certificate, which is obtained at the RA during initial registration. Confidential exchange of subsequent data, along with integrity, is then enabled. This starts with the user entering his/her personal data and the secret strings, which authenticate the user to the server. Next, the server triggers the browser to produce a key pair, and the public key is transmitted over the network for signing. When the certificate is produced, the user can download it to his/her computer, as every certificate is a public document. Regarding its revocation, the most straightforward procedure goes as follows: a user makes a request for revocation with the serial number of the certificate and signs it with the compromised private key.
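
The fingerprint check mentioned above can be reproduced outside the browser. The sketch below uses Python's standard ssl, socket and hashlib modules; the CA host name and the expected fingerprint are placeholders, and it is assumed that the CA's certificate chains to a root already present in the local trust store:

    import hashlib
    import socket
    import ssl

    CA_HOST = "ca.example.org"             # hypothetical CA server
    EXPECTED_FINGERPRINT = "aa:bb:cc:..."  # value obtained at the RA (placeholder)

    context = ssl.create_default_context()
    with socket.create_connection((CA_HOST, 443)) as raw:
        with context.wrap_socket(raw, server_hostname=CA_HOST) as tls:
            der_certificate = tls.getpeercert(binary_form=True)

    # Compare the certificate fingerprint with the one received during registration.
    fingerprint = ":".join(f"{byte:02x}" for byte in hashlib.sha1(der_certificate).digest())
    if fingerprint != EXPECTED_FINGERPRINT.lower():
        raise SystemExit("server certificate does not match the fingerprint from the RA")
    print("connected to the appropriate CA server")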

PKI efforts date back to the late eighties. Many standards now exist, and the main introductory reading from the technological point of view can be found in Arsenault et al. (2002).

However, there are still many open issues (Gutmann, 2002). There is no support for automatic certificate distribution and no support for automatic revocation checks. Further, there are problems with the atomicity of certificate revocation transactions and with the frequent issuing of CRLs. Finally, there are costs associated with the distribution of CRLs and problems with finding certification chains, i.e., determining the appropriate sequence of certificates in a non-centralized environment.

Additional Elements of Security Infrastructure—Commercial Off-the-Shelf Solutions

Security infrastructure is not limited to PKI, which is the basis, but also includes other systems that are mainly available as commercial off-the-shelf solutions:

  • Firewalls. These are specialized computer systems that operate on the border between the corporate network and the Internet, where all traffic must pass through them (Cheswick & Bellovin, 1994). The security policy defines which traffic is authorized to pass. Firewalls block vulnerable services and provide additional useful features such as hiding the local network through address translation, which prevents attackers from obtaining the data needed for successful attacks. Firewalls can also provide proxies that receive and preprocess requests before passing them on, and can run exposed servers such as HTTP daemons. Firewalls are not able to prevent attacks that may be tunneled to applications, e.g., virus attacks, Trojan horses, and bypassing of internal modem pools (Stallings, 1999).

  • Real-time intrusion detection systems (RIDS). Similarly to firewalls, RIDS present a mature technology that has been around for almost a decade (Kemmerer & Vigna, 2002). Their operation requires reliable and complete data about system activity, i.e., what data to log and where to get it. Two kinds of RIDS exist—the first is based on anomaly detection, and the second on misuse detection. Anomaly detection means detecting acts that differ from normal, known patterns of operation; its advantage is that previously unknown attacks can be detected, but such systems produce a high rate of false-positive alarms. Misuse detection, on the other hand, is based on definitions of wrong behavior: audited data is compared with these definitions and, when a match occurs, an alarm is generated (a minimal misuse-detection sketch follows this list). The benefit of these systems is the low rate of false-positive alarms, at the price of detecting only known attacks.

  • IPSec. The ordinary IP protocol, i.e., version 4, is known to be vulnerable in many ways: it is possible to masquerade as another party, to monitor a communication, to modify data and to take over a session. IPSec (Thayer, 1998) provides security within the IP layer through authentication, confidentiality, connectionless integrity, and access control. Additionally, limited traffic flow confidentiality is possible. A frequently used concept in the business area is the Virtual Private Network (VPN). It presents a cost-effective solution for building secure private networks over public networks such as the Internet. With the security services provided by IPSec, the physical links of an arbitrary provider can be used to transfer the organisation's protected data, thus logically implementing a private network. Devices, users and applications are authenticated by means of certificates and appropriate cryptographic protocols. Similarly, cryptographic protocols are used for secure session key exchange, so that confidentiality and integrity can be assured.

  • Secure Sockets Layer (SSL). This protocol, developed by Netscape, presents a common security layer for Web and other applications, and is available by default in Web browsers. It provides authentication, confidentiality and integrity, with the possibility of negotiating cryptographic primitives and encryption keys. When a secure connection is established, only the server is authenticated by default, while client authentication is optional. Every session is initiated by a client. The server sends its certificate and cryptographic preferences in response. The client then generates a master key, encrypts it with the server's public key and returns it to the server. Using its private key, the server recovers the master key and sends the client a message encrypted with this master key. This basic phase can optionally be extended with client authentication, which is analogous to the basic phase with the roles of client and server exchanged. After the involved parties have been authenticated, subsequent messages are encrypted with a symmetric algorithm that uses session keys derived from the master key to provide confidentiality (a client-side connection sketch follows this list). TLS (Dierks, 1999) is the successor of SSL, and it is not compatible with SSL.

  • Secure/Multipurpose Internet Mail Extensions (S/MIME). These are security enhancements for ordinary e-mail (Ramsdell, 1999), which was designed to transfer only printable characters. In order to send binary data, the data have to be re-coded into a format that is understood by the majority of mailing systems. This transformation is defined by the MIME standards (Freed, 1996); an encoding sketch follows this list. S/MIME is a security enhancement of MIME that uses X.509 certificates to provide authentication, confidentiality, integrity and non-repudiation.

  • Extensible Mark-up Language (XML). XML is becoming the de facto standard for the definition and processing of electronic business documents (Harold & Means, 2001). Its main business application is electronic data interchange (EDI), where mature standards based on older technologies were defined by ANSI ASC X.12 (ANSI, 2001) or EDIFact (UN, 1993). These standards were not very flexible and, additionally, they did not address security issues (these were left to the transport system). It is anticipated that the introduction of XML will bring further flexibility and security to EDI transactions. XML is a meta-language that consists of three parts. The first part covers basic XML documents, with user-defined tags in a human-readable form, which are used by subsequent programs as processing instructions (a parsing sketch follows this list). The second covers the proper structuring of XML documents, where document type definition files and schemas are used, which actually define the syntax of a document. The third is intended for the presentation of a document, where so-called cascading style sheets and extensible style sheets are defined. XML security standardization efforts are concentrated on the possibility of encrypting and signing only portions of documents, which would still enable automatic procedures in subsequent document processing.
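
First, the misuse-detection sketch referred to in the RIDS item: a toy Python detector compares audited log lines with hypothetical definitions of wrong behavior and raises an alarm on every match (the patterns and the audit file name are illustrative only).

    import re

    # Hypothetical definitions of wrong behavior (misuse signatures).
    SIGNATURES = {
        "failed login": re.compile(r"authentication failure"),
        "directory traversal": re.compile(r"\.\./\.\."),
    }

    def misuse_detection(audit_lines):
        """Compare audited data with the definitions; yield an alarm on every match."""
        for line in audit_lines:
            for name, pattern in SIGNATURES.items():
                if pattern.search(line):
                    yield name, line.rstrip()

    # "audit.log" is an illustrative audit trail file.
    with open("audit.log") as audit:
        for alarm in misuse_detection(audit):
            print("ALARM:", *alarm)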
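
Next, the connection sketch from the SSL item: a client establishes a TLS-protected session with Python's standard ssl module. The server is authenticated against the platform's trusted certificates by default, and the commented-out line indicates how an optional client certificate would be presented (host and file names are illustrative).

    import socket
    import ssl

    HOST = "www.example.com"   # illustrative server

    context = ssl.create_default_context()  # server authentication is the default
    # context.load_cert_chain("client_cert.pem", "client_key.pem")  # optional client authentication

    with socket.create_connection((HOST, 443)) as raw:
        with context.wrap_socket(raw, server_hostname=HOST) as tls:
            # The negotiated protocol version and the symmetric cipher suite for this session.
            print("protocol:", tls.version())
            print("cipher:  ", tls.cipher())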
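
The encoding sketch from the S/MIME item: Python's standard email package re-codes a binary attachment into base64 so that the message consists of printable characters only (the addresses and payload are invented).

    from email.mime.application import MIMEApplication
    from email.mime.multipart import MIMEMultipart
    from email.mime.text import MIMEText

    message = MIMEMultipart()
    message["From"] = "alice@example.com"
    message["To"] = "bob@example.com"
    message["Subject"] = "Quarterly report"
    message.attach(MIMEText("The report is attached."))

    binary_payload = bytes(range(256))  # arbitrary binary data
    message.attach(MIMEApplication(binary_payload, Name="report.bin"))  # base64-encoded by default

    # The binary part now appears as printable base64 text in the serialized message.
    print(message.as_string()[:400])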
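
Finally, the parsing sketch from the XML item: a small EDI-style document with user-defined tags is read with Python's standard xml.etree module (the document itself is invented).

    import xml.etree.ElementTree as ET

    # An invented order document with user-defined, human-readable tags.
    DOCUMENT = """
    <order number="1234">
      <buyer>ACME Ltd.</buyer>
      <item sku="42" quantity="100">Widget</item>
    </order>
    """

    root = ET.fromstring(DOCUMENT)
    print("order number:", root.get("number"))
    for element in root:
        # Subsequent programs use the tags as processing instructions.
        print(element.tag, element.attrib, (element.text or "").strip())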

Security Issues of New Paradigms in IT

New paradigms include objects, components, mobile code (computing) and intelligent agents. These are all based on recent trends in software development, i.e., object-oriented design and implementation, and network awareness (network awareness means that the code has to be highly integrated into the network environment and react accordingly).

The Java language has been designed in line with the above requirements and is becoming the de facto standard for programming modern, network-aware applications. Java is based on the object paradigm, where objects are self-contained pieces of software code with their own data and methods. Nowadays objects are usually grouped into components, that is, independent modules which can be inserted into, or removed from, an application without requiring other changes to the application.

Nevertheless, objects present generic elements; therefore, security issues have to start with proper treatment of objects. Every piece of code (and every object) can be treated as an electronic document. The creator defines its initial data and behavior (methods) and optionally signs it. The signature on the code gives a user the possibility to be assured of the proper functioning of this object. The problem is analogous to the problem of assuring authentication and integrity for ordinary electronic documents. The mainstream development of software systems goes in the direction where objects/components will be available over the network for installation and execution on local (or remote) premises. To ensure security for a local environment, which means protection from malicious code, these objects have to be signed by the producer and, before being deployed, the signatures must be checked. If the source of the objects (components) is trusted, the code can be executed or installed.
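
A minimal verification step of that kind, assuming the pyca/cryptography package, an RSA producer key, and that the producer's public key and the signature are distributed alongside the component (all file names and the helper function are illustrative):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def component_is_authentic(code: bytes, signature: bytes, producer_key_pem: bytes) -> bool:
        """Check the producer's signature on a component before it is installed or run."""
        public_key = serialization.load_pem_public_key(producer_key_pem)
        try:
            public_key.verify(signature, code, padding.PKCS1v15(), hashes.SHA256())
            return True
        except InvalidSignature:
            return False

    # Illustrative file names; deploy only when the check succeeds and the producer is trusted.
    code = open("component.jar", "rb").read()
    signature = open("component.sig", "rb").read()
    producer_key_pem = open("producer_pub.pem", "rb").read()
    if component_is_authentic(code, signature, producer_key_pem):
        print("signature valid: component may be installed or executed")
    else:
        print("signature check failed: reject the component")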

An important new paradigm in IT is the proliferation of mobile computing, which has significant implications for the business environment. One should bear in mind that we are reaching the point where the number of wireless devices will exceed the number of fixed nodes. This poses new requirements on security, as handheld mobile devices have limited processing power and memory capacity. Thanks to the invention of elliptic curve cryptography, it is possible to provide strong cryptographic mechanisms for these devices. However, the problem for the wireless world is PKI. Besides the open issues already mentioned, PKI in the wireless world requires extensive computation for certificates and CRLs and further narrows the available throughput. Appropriate standards that would enable wide-scale secure deployment are yet to come (Miller, 2001).
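
A brief ECDSA sketch, assuming the pyca/cryptography package: a 256-bit elliptic curve key signs and verifies a message, illustrating the short keys that make these systems attractive for constrained devices (the message is invented).

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    # A 256-bit curve offers strength comparable to a much longer RSA modulus,
    # which matters on devices with limited processing power and memory.
    device_key = ec.generate_private_key(ec.SECP256R1())
    message = b"transaction authorized from a handheld device"

    signature = device_key.sign(message, ec.ECDSA(hashes.SHA256()))
    device_key.public_key().verify(signature, message, ec.ECDSA(hashes.SHA256()))  # raises on mismatch
    print("ECDSA signature verified")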

A fundamentally different approach in the contemporary computing environment is presented by mobile code and, especially, mobile intelligent agents—besides mobility, these programs exhibit autonomy, adaptability, intelligence, the capability of cooperation, and persistence (Griss, 2001). Agents are passed from one processing environment to another, where they use the computing resources of the host. Intelligent agents act on behalf of their users to find the best offers, bid at auctions, etc. Therefore, their security is of utmost importance. Fundamental threats include uncontrolled read and write access to core agent services, the privacy and integrity of their messages, and denial-of-service problems. The reason is that agents operate in unpredictable environments and have to be protected from malicious hosts. Put another way, mobile agents are vulnerable to code peeping and code modification through false computation. These important issues are yet to be resolved (FIPA, 2001).



