Managing and Sharing Information


The next category of P2P services distributes the management and sharing of computer resources to a group of peers across the network. A number of subcategories are included:

  • File sharing

  • Resource sharing

  • Distributed search engines

File sharing applications such as Gnutella and Freenet form ad hoc P2P communities that share files without requiring centralized coordination or control. Resource sharing applications are a form of distributed computing that uses the cumulative power of dynamically networked peers to tackle tasks previously possible only on supercomputers. The SETI project referenced in the first chapter is one example of resource sharing.

Finally, distributed search engines are a P2P technology that addresses the problems inherent in the sheer size of the information space. Distributed search engines push the computing functions needed to build an index of search results toward the edge of the network, where peers live. They use a divide-and-conquer strategy to locate information and perform these searches in real time.
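
The divide-and-conquer idea can be sketched in a few lines of Java. The example below assumes a hypothetical Peer interface with a searchLocal method; real protocols such as Gnutella, discussed later in this chapter, define their own message formats, but the fan-out-and-merge structure is the same.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;

// Hypothetical peer abstraction: each peer can search its own local content.
interface Peer {
    List<String> searchLocal(String query);
}

public class DistributedSearch {
    // Fan the query out to every known peer in parallel, then merge the results.
    public static List<String> search(String query, List<Peer> peers)
            throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(Math.max(1, peers.size()));
        List<Callable<List<String>>> tasks = new ArrayList<>();
        for (Peer p : peers) {
            tasks.add(() -> p.searchLocal(query));   // each peer searches its own content
        }
        List<String> merged = new ArrayList<>();
        // Give slow or unreachable peers a deadline so the search stays close to real time.
        for (Future<List<String>> f : pool.invokeAll(tasks, 2, TimeUnit.SECONDS)) {
            try {
                merged.addAll(f.get());
            } catch (CancellationException | ExecutionException e) {
                // A peer that timed out or failed simply contributes no results.
            }
        }
        pool.shutdown();
        return merged;
    }
}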

Technology

The P2P applications presented in this chapter go a long way toward explaining the definition of P2P. Understanding P2P file and resource sharing, however, also requires knowledge of access control, searching algorithms, metadata, and system performance techniques such as caching, clustering, and synchronization, which we'll examine next.

Access Control

Early P2P applications favored open access over security. Now that 30 million users are chatting away on the America Online network alone, and a significant percentage of them are chatting at work, the inability to control access to chat conversations poses a significant risk to businesses that have come to depend on chat.

P2P file sharing applications are commonly found in businesses today, and pose a large risk. A typical P2P file sharing application involves four steps:

  1. Download and install the client.

  2. Launch the client.

  3. Designate a folder on your hard drive to share.

  4. Search for and download files from other peers' hard drives.

As you can see, the potential for security breaches is enormous. First, can the downloaded application be trusted? What assurance do you have that, once installed, the application is not scanning your hard drive and sending information to an undisclosed destination, or that it is not deleting files or performing some other malicious activity? Unless the source is trustworthy, you have no assurance at all; and even then, malfunctioning programs can do damage unintentionally.
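
To see why that trust matters, consider how little code an installed client needs in order to walk a file system. The sketch below merely lists the contents of a hypothetical shared folder (the path is assumed for illustration), but the same call pointed at any other directory would enumerate everything the user can read.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class SharedFolderScan {
    public static void main(String[] args) throws IOException {
        // Hypothetical shared folder chosen by the user in step 3.
        Path sharedFolder = Paths.get(System.getProperty("user.home"), "shared");

        // Walk the folder and list every regular file that could be offered to other peers.
        // Pointed elsewhere, the same API would enumerate anything the user can read.
        try (Stream<Path> files = Files.walk(sharedFolder)) {
            files.filter(Files::isRegularFile)
                 .forEach(f -> System.out.println(f + " (" + fileSize(f) + " bytes)"));
        }
    }

    private static long fileSize(Path p) {
        try {
            return Files.size(p);
        } catch (IOException e) {
            return -1;
        }
    }
}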

Despite the security hazards, P2P file sharing ranks number two in usage according to industry analysts. The reward of getting that hot MP3 file for free has outweighed security concerns for many individuals. For business adoption and the growth of distributed content networks (DCN), however, access control must improve. Networks will need authentication and authorization services to prove identity and associated permissions, confidentiality and privacy safeguards to ensure that data cannot be seen or disclosed to outside parties, and message integrity checks to protect data in transit.
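
As one illustration of the message integrity requirement, the following sketch uses the standard javax.crypto API to compute and verify an HMAC over a message. How the two peers come to share sharedKey (the authentication and key-distribution problem) is deliberately left out.

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.GeneralSecurityException;
import java.security.MessageDigest;

public class MessageIntegrity {
    // Compute an HMAC-SHA256 tag over the message using a key the two peers already share.
    static byte[] sign(byte[] sharedKey, String message) throws GeneralSecurityException {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(sharedKey, "HmacSHA256"));
        return mac.doFinal(message.getBytes(StandardCharsets.UTF_8));
    }

    // The receiving peer recomputes the tag and compares it in constant time;
    // a mismatch means the message was altered in transit.
    static boolean verify(byte[] sharedKey, String message, byte[] tag)
            throws GeneralSecurityException {
        return MessageDigest.isEqual(sign(sharedKey, message), tag);
    }
}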

Searching and Locating

P2P search engines are becoming more popular because they enable a search to be run in parallel using resources that have heretofore been untapped. Finding relevant information is becoming both more important and more difficult because of four business trends:

  • Information hypergrowth Businesses continue to create huge amounts of information, and have turned to the Internet to serve as a global repository.

  • Information silos Businesses often create organizational, technical, or functional boundaries around their information, making it difficult to access openly.

  • Unstructured content Although EDI- and XML-defined content continues to grow, a tremendous number of ad hoc data structures are still used to store information.

  • The growing need to collaborate Internally and externally, businesses are becoming more interconnected. The need to collaborate continues to grow, and requires access to decentralized and relevant information.

A new breed of P2P applications is dealing with searching and locating information in new and innovative ways. Two important trends in computing are radically improving search capabilities in P2P applications:

  • Parallelism The capability to divide the search process into coordinated tasks.

  • Metadata Data that describes information to enable humans, and especially computers, to process that information more intelligently.
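
To make the metadata idea concrete, here is a minimal, hypothetical descriptor for a shared file; the field names are illustrative and not taken from any particular protocol.

import java.util.List;

// Hypothetical metadata record describing one shared file.
// The descriptor, not the file itself, is what peers exchange and search.
public class FileMetadata {
    final String fileName;
    final long sizeInBytes;
    final String mimeType;
    final List<String> keywords;   // human- and machine-readable hints for searching

    FileMetadata(String fileName, long sizeInBytes, String mimeType, List<String> keywords) {
        this.fileName = fileName;
        this.sizeInBytes = sizeInBytes;
        this.mimeType = mimeType;
        this.keywords = keywords;
    }

    // A peer can answer a query from the metadata alone, without opening the file.
    boolean matches(String term) {
        String t = term.toLowerCase();
        return fileName.toLowerCase().contains(t)
                || keywords.stream().anyMatch(k -> k.toLowerCase().contains(t));
    }
}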

As a result, distributed searching has a number of benefits:

  • Increased efficiency Distributed searching eliminates the need to centralize and normalize the data.

  • Increased accuracy Content owners can continually update their information, even as others are searching it. Users see only the most current and accurate information.

  • Simple maintenance IT organizations don't have to maintain and update multiple sources of departmental information. The responsibility remains with the information owners.

Caching, Clusters, and Synchronization

The origins of P2P technology go back to a time when the scale of potential users of a software application grew from hundreds to thousands. P2P technology distributed the computing power of an application to the peers to avoid scalability bottlenecks imposed by a client/server architecture. Although P2P architecture worked for the jump to thousands of users, many P2P technologies today enable millions of users. Scaling up to millions of users required innovations in caching, clustering, and synchronization technology.

P2P caching technology largely drew from caching techniques developed for client/server systems, including the following:

  • Replication All peers eventually get a copy of every change in a P2P network. Each peer is programmed to forward a copy of a data change in a token. The token gives each peer a unique identifier for the data and a validity date. The peer records the changed data if the validity date is more recent than that of the data it holds. Replication works best for applications that are not tied to a certain state. For example, real estate listings don't change frequently and are not required to always be accurate, so they replicate well; an airline reservation system, by contrast, would likely not work well in a replicated design. (A sketch of this rule appears after this list.)

  • First-In First-Out (FIFO) Each peer in a network is part of a hierarchy of peers that share data updates. New data filters into the P2P network and replaces existing data. The downside to FIFO is the overhead needed to coordinate the hierarchy of peers.

  • Hybrid centralized/peer Each peer holds its own copy of data, but looks to a centralized database to identify when data updates are available. The peer then finds the updates in other peers.

  • Lossy data techniques Each peer holds its own copy of data and shares data updates with the peers it immediately knows about. The downside is that data might not make it to all nodes in the P2P network.
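
The following sketch, using illustrative class names rather than any particular product's, shows the replication rule from the first bullet: a peer records an incoming change only if the token's validity date is newer than the copy it already holds, and then forwards the token to its neighbors.

import java.time.Instant;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative replication token: a unique identifier for the data item,
// the new value, and a validity date used to resolve conflicts.
class ReplicationToken {
    final String dataId;
    final String value;
    final Instant validityDate;

    ReplicationToken(String dataId, String value, Instant validityDate) {
        this.dataId = dataId;
        this.value = value;
        this.validityDate = validityDate;
    }
}

class ReplicatingPeer {
    private final Map<String, ReplicationToken> store = new ConcurrentHashMap<>();
    private final List<ReplicatingPeer> neighbors;

    ReplicatingPeer(List<ReplicatingPeer> neighbors) {
        this.neighbors = neighbors;
    }

    void receive(ReplicationToken token) {
        ReplicationToken current = store.get(token.dataId);
        // Record the change only if it is newer than what this peer already holds.
        if (current == null || token.validityDate.isAfter(current.validityDate)) {
            store.put(token.dataId, token);
            // Forward the token so every peer eventually sees the change.
            for (ReplicatingPeer neighbor : neighbors) {
                neighbor.receive(token);
            }
        }
        // An older or duplicate token is dropped, which also stops the flood.
    }
}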

Blindly implementing P2P architectures will not improve the efficiency of your network. Caching must be used to ensure that data is not accessed unnecessarily, and the network must be aware of information location to avoid redundant network transport, or movement of data. Highly volatile data, however, carries the overhead of synchronization to ensure that only the most recent information is available to users and applications, which makes intelligent caching difficult.

Clustering provides a mechanism to increase the throughput and availability of computing resources. By linking peers to form clusters, you can build more scalable solutions and redundancy into your network and applications. P2P clusters can form dynamically. This can minimize the amount of human intervention and administration required to operate and maintain networks.
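
One simple way to picture dynamic cluster formation, assuming peers periodically announce themselves with heartbeats, is a membership table that adds peers as they are heard from and silently drops those that stop responding. The timeout value below is arbitrary.

import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative cluster membership table: peers join by announcing themselves
// and are dropped automatically when their heartbeats stop.
public class ClusterMembership {
    private static final Duration TIMEOUT = Duration.ofSeconds(30);  // arbitrary for this sketch
    private final Map<String, Instant> lastHeard = new ConcurrentHashMap<>();

    // Called whenever a heartbeat or message arrives from a peer.
    public void heartbeat(String peerId) {
        lastHeard.put(peerId, Instant.now());
    }

    // The current cluster is simply every peer heard from recently,
    // so the cluster grows and shrinks without an administrator.
    public Set<String> members() {
        Instant cutoff = Instant.now().minus(TIMEOUT);
        lastHeard.values().removeIf(t -> t.isBefore(cutoff));
        return lastHeard.keySet();
    }
}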

All these techniques imply more intelligent software: software that is more "aware" of its environment.

Products

As with instant messaging, the popularity of file and resource sharing has created a large market for products.

Gnutella

As mentioned in Chapter 1, "What Is P2P?," Gnutella is a popular file sharing and searching protocol that operates with edge devices in a decentralized environment. In the Gnutella network, each peer acts as a point of rendezvous and a router, using TCP for message transport and HTTP for file transfer.

Searches on the network are propagated by a peer to all its known peer neighbors, who in turn propagate the query to other peers. Content on the Gnutella network is advertised through an IP address, a port number, an index number identifying the file on the host peer, and file details such as name and size. Additional information on the current status of Gnutella can be found at http://www.gnutellanews.com/.
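
The fields of a Gnutella content advertisement map naturally onto a small class. The following is a simplified model of the logical content of a query hit, not the actual binary wire format used by the protocol.

// Simplified model of the information a Gnutella peer returns for a matching file.
// The real protocol packs these fields into a binary query-hit descriptor; this is
// just the logical content.
public class QueryHit {
    final String hostAddress;  // IP address of the peer that holds the file
    final int port;            // port where the peer accepts HTTP download requests
    final int fileIndex;       // index identifying the file on that peer
    final String fileName;
    final long fileSize;

    QueryHit(String hostAddress, int port, int fileIndex, String fileName, long fileSize) {
        this.hostAddress = hostAddress;
        this.port = port;
        this.fileIndex = fileIndex;
        this.fileName = fileName;
        this.fileSize = fileSize;
    }

    // Downloads happen over HTTP, directly between the two peers.
    // "/get/<index>/<name>" is the download request path used by classic Gnutella clients.
    String downloadUrl() {
        return "http://" + hostAddress + ":" + port + "/get/" + fileIndex + "/" + fileName;
    }
}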

The Gnutella client LimeWire (http://www.limewire.com/) is a software package that enables individuals to search for and share computer files over the Internet. LimeWire is compatible with the Gnutella file sharing protocol, and can connect with any peer running Gnutella-compatible software.

At startup, the LimeWire client connects via the Internet to the LimeWire gateway. The LimeWire gateway is an intelligent Gnutella router. LimeWire is written in Java, and will run on any machine with an Internet connection and the capability to run Java.

In the sample search request seen in Figure 3.7, a request for Eric Clapton audio files was submitted through the LimeWire client. The network responded with 178 matches! The application displays the type of file, the peer connection speed, the size of the file, and the IP address of the download location. You can select the MP3 file, download it to your local hard drive, and launch the included MP3 player.

Figure 3.7. LimeWire is a Gnutella client that promotes distributed searching and file sharing.


NextPage

NextPage bridges consumer-oriented Internet architectures with corporate intranets and extranets. The company is adopting a P2P strategy with its server-oriented architecture. NextPage's NXT-3 Content Networking platform enables users to "manage, access, and exchange content across distributed servers on intranets and via the Internet." The platform indexes and connects content across organizational boundaries, allowing you to search and locate content from multiple locations without having to know where it physically resides.

NextPage offers an extensive array of search functions including keyword, Boolean, phrase, ranked, wildcard, and so on. For more information, go to www.nextpage.com.


