According to Webopedia.com, clustering means "connecting two or more computers together in such a way that they behave like a single computer. Clustering is used for parallel processing, load balancing and fault tolerance." Clustering can apply to everything from application servers to databases to networking devices and more. In the world of Java, clustering is typically used to achieve scalability and reliability in enterprise applications. On the reliability front, for example, this can mean HTTP session and component failover. Entire articles have been written on this subject, but let me provide some minimal guidelines here for clustering your Java applications:
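Before the guidelines, a quick illustration of one practical consequence of HTTP session failover: a clustered servlet container replicates session state between nodes by serializing it, so every object you store in the session must be `Serializable`. The sketch below simulates that replication round trip in plain Java; the `ShoppingCart` class and all names in it are hypothetical examples, not from any particular framework.

```java
import java.io.*;

// Hypothetical session attribute. For HTTP session failover, every object
// stored in the session must implement Serializable so the container can
// replicate it to other nodes in the cluster.
class ShoppingCart implements Serializable {
    private static final long serialVersionUID = 1L;
    private final String itemSku;
    private final int quantity;

    ShoppingCart(String itemSku, int quantity) {
        this.itemSku = itemSku;
        this.quantity = quantity;
    }

    String itemSku() { return itemSku; }
    int quantity() { return quantity; }
}

public class SessionReplicationDemo {
    // Simulate what a clustered container does under the hood: serialize the
    // session attribute on one node and deserialize it on another.
    static Object roundTrip(Object attribute)
            throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(attribute);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        ShoppingCart cart = new ShoppingCart("SKU-42", 3);
        // If ShoppingCart were not Serializable, this would throw
        // NotSerializableException -- exactly what happens in a cluster.
        ShoppingCart copy = (ShoppingCart) roundTrip(cart);
        System.out.println(copy.itemSku() + " x" + copy.quantity());
    }
}
```

If an attribute holds a non-serializable field (an open socket, a database connection), replication fails for the whole session, which is why keeping session state small and serializable appears in most clustering guidelines.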