Security


Chapter 10, "P2P Security," discusses traditional network security in distributed systems and highlights new P2P issues.

Traditional Requirements

Traditionally, security is defined as the protection of information, systems, and services against manipulation, mistakes, and disasters. Network security comprises authentication, authorization, integrity, confidentiality, and nonrepudiation.

The Elements of Network Security

Authentication is the most common type of network security. It generally involves a user or process demonstrating some form of evidence to prove identity. Such evidence might be information only the user would likely know (a password), or it might be information only the user could produce (signed data using a private key).

Authorization involves the capability to enforce access controls upon an authenticated user. This is commonly implemented as an access control policy that provides an association between a user's access rights and system resources, such as databases, files, and processes.
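Such an access control policy can be sketched as a simple mapping from users to the resources they may touch. This is a minimal illustration, not a production design; the class and method names are invented for the example.

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch of an access control policy: an association between
// a user's access rights and system resources (databases, files, processes).
public class AccessControlPolicy {
    private final Map<String, Set<String>> rights = new HashMap<>();

    // Grant an authenticated user access to a named resource.
    public void grant(String user, String resource) {
        rights.computeIfAbsent(user, u -> new HashSet<>()).add(resource);
    }

    // Enforce the policy: only granted users pass the check.
    public boolean isAuthorized(String user, String resource) {
        return rights.getOrDefault(user, Collections.emptySet()).contains(resource);
    }
}
```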

Integrity ensures that messages are delivered correctly, and that messages in transit have not been tampered with maliciously.

Confidentiality and privacy ensure that data cannot be seen or disclosed over the network by outside parties. Encryption is a procedure used to convert text into code to prevent anyone but the intended recipient from reading that data. It can be used to ensure the confidentiality of data, the authentication of the data sender, or the integrity of the data sent.
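Confidentiality through encryption can be demonstrated with the standard `javax.crypto` API. The sketch below uses AES in ECB mode only to keep the example short; a real system should prefer an authenticated mode such as AES/GCM with a random IV.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

// Symmetric encryption sketch: text is converted into ciphertext that
// only a holder of the secret key can read.
public class ConfidentialityDemo {
    public static SecretKey newKey() throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        return kg.generateKey();
    }

    public static byte[] encrypt(SecretKey key, byte[] plaintext) throws Exception {
        Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
        c.init(Cipher.ENCRYPT_MODE, key);
        return c.doFinal(plaintext);
    }

    public static byte[] decrypt(SecretKey key, byte[] ciphertext) throws Exception {
        Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
        c.init(Cipher.DECRYPT_MODE, key);
        return c.doFinal(ciphertext);
    }
}
```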

Finally, nonrepudiation guarantees that a sender cannot deny having sent a particular message.

P2P requires the same level of security as traditional distributed systems. In addition, P2P highlights a number of security-related topics such as anonymity, trust, and accountability.

Anonymity

Anonymity relates to privacy. It ensures that someone can publish a document without having the system or an individual trace its origin. In other words, the author can remain anonymous. Anonymity can extend beyond the author to include the publisher, the reader, or the physical hardware and network supporting the system. Ensuring anonymity can be a difficult requirement to meet in distributed systems.

A number of techniques have been implemented to guarantee anonymity. Proxy servers or gateway servers can manipulate IP addresses to mask the real IP address of the publisher. An individual file can be split into multiple components and stored on multiple servers; the original file can be rebuilt only with encryption keys held by trusted members of the community.
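The splitting idea can be illustrated with a simple XOR secret-sharing sketch: the chapter does not specify a particular scheme, so this is one hypothetical approach in which no single server's share reveals anything about the file, and all shares are needed to rebuild it.

```java
import java.security.SecureRandom;

// Hypothetical file-splitting sketch: n-1 shares are random bytes, and the
// last share is the file XORed with all the random shares. XORing every
// share back together recovers the original file; any subset reveals nothing.
public class XorSplit {
    public static byte[][] split(byte[] data, int n) {
        SecureRandom rnd = new SecureRandom();
        byte[][] shares = new byte[n][data.length];
        byte[] last = data.clone();
        for (int i = 0; i < n - 1; i++) {
            rnd.nextBytes(shares[i]);                       // random share for server i
            for (int j = 0; j < data.length; j++) {
                last[j] ^= shares[i][j];                    // fold it into the final share
            }
        }
        shares[n - 1] = last;
        return shares;
    }

    public static byte[] rebuild(byte[][] shares) {
        byte[] out = new byte[shares[0].length];
        for (byte[] share : shares) {
            for (int j = 0; j < out.length; j++) {
                out[j] ^= share[j];
            }
        }
        return out;
    }
}
```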

Trust

Trust is an important concept in computing. We are often more trusting on the Internet than we might like to believe. How often have you downloaded a file without checking the integrity or reliability of the source?

Trust implies confidence in the individual or system you interact with. You assume that malicious behavior will not occur with members or machines in the community. Of course, in the real world this is not always the case. Systems must be designed to ensure that less trustworthy components are identified and quickly removed. You cannot guarantee the content of a file or message exchange, but you must be able to identify some entity and hold it accountable for disruptive or malicious behavior. As you can see, trust and anonymity can have conflicting goals.

To improve trust, downloaded files can be secured with message-digest functions. A message-digest function takes a variable length input message/file and produces fixed-length output. The same input will always produce the same output. Message digest functions are used to detect file tampering. This raises the level of trust that the downloaded file has not been altered.
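This check is straightforward with the standard `java.security.MessageDigest` API; the helper method name below is invented for the example.

```java
import java.security.MessageDigest;
import java.util.Arrays;

// A message digest maps a variable-length input to fixed-length output,
// and the same input always produces the same output. Comparing the
// publisher's digest with a digest of the received file detects tampering.
public class DigestCheck {
    public static byte[] digest(byte[] data) throws Exception {
        return MessageDigest.getInstance("SHA-256").digest(data);
    }

    // True if the received file still matches the digest the publisher announced.
    public static boolean unaltered(byte[] publishedDigest, byte[] receivedFile) throws Exception {
        return Arrays.equals(publishedDigest, digest(receivedFile));
    }
}
```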

Often digital signatures are used to identify the author of a file. An author can digitally sign a file to provide proof that the file indeed is from the author. Digital signatures rely on a pair of keys: a private key and a public key. The author uses the private key to create a signature on the file. The receiver uses the public key to verify that the signed data is from the author.
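The sign-and-verify flow maps directly onto the standard `java.security.Signature` API:

```java
import java.security.PrivateKey;
import java.security.PublicKey;
import java.security.Signature;

// The author signs the file with the private key; the receiver verifies
// the signature with the matching public key.
public class SignDemo {
    public static byte[] sign(PrivateKey authorKey, byte[] file) throws Exception {
        Signature s = Signature.getInstance("SHA256withRSA");
        s.initSign(authorKey);
        s.update(file);
        return s.sign();
    }

    public static boolean verify(PublicKey authorPublicKey, byte[] file, byte[] sig) throws Exception {
        Signature s = Signature.getInstance("SHA256withRSA");
        s.initVerify(authorPublicKey);
        s.update(file);
        return s.verify(sig);
    }
}
```

If the file is modified in transit, or the signature was produced with a different private key, verification fails.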

Digital certificates improve our ability to trust information by providing a process to ensure the private/public key pair is legitimate and registered with a third-party authority. Digital certificates are issued by companies called certifying authorities (CAs). An individual or corporation must apply for a digital certificate with the proper credentials, usually requiring a fee.

Pretty Good Privacy (PGP) is software (www.mit.edu/network/pgp.html) that automates encrypting files and email using digital certificates. It enables certificates to be generated without CA involvement. Individuals can certify each other's certificates. Although this might not seem secure, the idea is that you can trust an individual who has signed for another individual. In effect, intermediaries begin to act as certifying authorities. If you trust the person who has certified a peer, then you trust the peer. It is a kind of transitive trust, and in P2P systems it has come to be called the "web of trust."

Most PGP-based P2P offerings to date require all members of the community to share the same key. Shared keys create a security loophole in a P2P system: when one member is removed from the community, all remaining peers must be issued a new key, and all the content they share must be re-encrypted.

Accountability

Accountability refers to the concept of making users accountable for the resources they consume. For instance, most systems require a user to have an account, which in turn provides the user with access to certain resources and services offered by the system. The account provides the identification to track the use of resources. Of course, this is not a problem in centralized systems, in which all access goes through a central point of control. However, distributed P2P systems can present a number of challenges, and are more vulnerable to attack. For instance, without an identity service, user identification can be difficult if not impossible to implement. Network maps might only contain IP addresses, and these are often changing and transient for edge devices. In addition, many unknown intermediaries might be involved in a transaction or exchange, complicating most electronic payment systems.

One approach to accountability is the pessimistic model. A system based on this model minimizes the resources, such as bandwidth, disk space, and message transfers, that it makes available to all participants in the P2P system. It is a simple risk-versus-reward model.

A second approach can be referred to as the optimistic model. This model assumes resource allocation is proportional to the degree of trust. The more trustworthy a member of the community, the more resources available for consumption. This model requires a reputation system, which collects history on the identification and usage patterns of a peer member.
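The optimistic model can be sketched as a tiny reputation store whose resource quota grows with a peer's trust score. The class name, the linear formula, and the base allocation are all illustrative assumptions, not from the chapter.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the optimistic model: resource allocation is
// proportional to the degree of trust, based on collected usage history.
public class ReputationQuota {
    private final Map<String, Integer> score = new HashMap<>();
    private static final int BASE_KB = 64; // minimal allocation for unknown peers

    // Record one successful, well-behaved exchange with a peer.
    public void recordGoodTransfer(String peer) {
        score.merge(peer, 1, Integer::sum);
    }

    // The more trustworthy the member, the more bandwidth it may consume.
    public int bandwidthQuotaKb(String peer) {
        return BASE_KB * (1 + score.getOrDefault(peer, 0));
    }
}
```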



Java™ P2P Unleashed
ISBN: N/A
EAN: N/A
Year: 2002
Pages: 209
