In this chapter you learned about various topics. You configured Tomcat to work as a stand-alone Web server and as a servlet/JSP engine for Apache, and you saw how to configure Apache to use a connector to communicate with Tomcat. You then examined some common security enhancements for using virtual hosts.
Before you can confidently move your test server into production, meaning that it will be completely exposed to real-world traffic, you should test how it behaves under load.
An automated server load test simulates client requests against the server so that you can measure its behavior under realistic and peak conditions.
Server load testing measures the scalability of the server and thus the ability of the system to handle an increased load without degradation of performance or reliability. Scalability is how well a solution to a problem will work when the size of the problem increases.
You need to make several decisions when setting up and configuring Tomcat that will affect the scalability of your installation.
The JVM sets its own memory usage, but you can configure the limits that it uses at the command line. These settings alter the JVM’s heap, which is where object instances are stored.
You should remember two very important switches when you set up a Tomcat instance.
-Xms <size>: The initial heap size for the JVM
-Xmx <size>: The maximum heap size for the JVM
If you don’t explicitly set these parameters, the JVM will use its defaults, which are a minimum of 2MB and a maximum of 64MB for JDK 1.4 and JDK 5.
Maximum heap size is the upper limit of RAM that the JVM will allocate to the heap. To set the maximum heap size to 256MB, use the following switch:

-Xmx256m
To specify memory size in GB, use the letter g instead of m.
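For example, you could set both switches in the JAVA_OPTS environment variable, which Tomcat's startup scripts pass to the Java executable. This is a minimal sketch; the heap values shown are illustrative, not recommendations:

```shell
# Sketch: give the JVM a 128MB initial heap and a 256MB maximum heap.
# catalina.sh/startup.sh read JAVA_OPTS and append it to the java command line.
export JAVA_OPTS="-Xms128m -Xmx256m"
echo "$JAVA_OPTS"
# Then start Tomcat as usual, for example: $CATALINA_HOME/bin/startup.sh
```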
You use the initial heap size setting to allocate memory to the Java heap at JVM startup. In a memory-intensive, heavily loaded application, initial heap size can be important; if the JVM starts with a very small heap size and it receives a large number of requests, it must repeatedly resize the heap to keep up, which costs processing time.
Something to bear in mind when doing this is that setting the heap size to a value that's as large as your server will allow isn't always a good idea. This may cause long pauses while the garbage collector sweeps through the large heap.
One possible solution to this problem is to pass the following command-line option to the Java executable:

-Xincgc
This forces the garbage collector to run in incremental mode, meaning it runs more often but checks through smaller amounts of memory each time. You should monitor this setting's effect on overall performance, because incremental collection adds some overhead of its own.
Lowering the size of the heap may also help this situation, as would a combination of both these techniques. These are prime examples of why you should load test your server before it goes into production. Otherwise, you wouldn’t know which of these settings was most appropriate for the Web applications on your server.
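Putting these techniques together, a startup configuration might lower the maximum heap and enable incremental collection at the same time. The values below are illustrative starting points for load testing, not recommendations:

```shell
# Sketch: a smaller maximum heap combined with incremental garbage collection.
export JAVA_OPTS="-Xms64m -Xmx192m -Xincgc"
echo "$JAVA_OPTS"
```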
Several connector parameters may affect your server's performance. The following are the performance-critical attributes of the <Connector> element. For an exhaustive discussion of these elements, see Chapter 9.
The acceptCount attribute sets the number of connections that the server will accept while waiting for a free processor. Incoming connections over this limit will be refused.
The maxProcessors attribute imposes a limit on the number of threads the server will start, regardless of the server load. If the server receives more simultaneous requests for a given connection than the value of this setting, the requests will block until a thread is freed to handle them. If this number is set too high, heavily loaded sites run the risk of a performance slowdown as the server spends resources managing a large number of threads.
You can monitor thread count with operating system–specific tools, such as ps in Unix-like systems. If the number of threads approaches the maxProcessors setting, followed by a server performance slowdown, you should increase this setting and repeat the experiment. The default is 20.
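As a sketch of such monitoring, the Linux procps version of ps can report a process's thread count via the NLWP (number of lightweight processes) column. The TOMCAT_PID variable would normally hold the Tomcat JVM's process ID; here the current shell's PID stands in so the example is self-contained:

```shell
# Sketch: report the thread count (NLWP) of a process with Linux procps ps.
# TOMCAT_PID would normally be the Tomcat JVM's PID; $$ is used as a stand-in.
TOMCAT_PID=$$
ps -o nlwp= -p "$TOMCAT_PID"
```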
A processor is a thread that handles requests for a connector on a given port. Setting the minProcessors attribute too high can produce a large number of unnecessary threads, which will put an extra load on the server's resources.
As with the maxProcessors attribute, you can monitor the thread count with operating system–specific tools. If you see the number of idle threads consistently exceeding what your load requires, reduce this setting.
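Bringing these attributes together, an HTTP connector entry in server.xml might look like the following sketch. The values are illustrative starting points to be refined through load testing, not recommendations:

```xml
<!-- Sketch: an HTTP <Connector> with the three performance-critical
     attributes discussed above; values are illustrative only. -->
<Connector port="8080"
           minProcessors="10"
           maxProcessors="75"
           acceptCount="100"/>
```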
Tomcat's default session manager is very fast because it stores its data in memory, as discussed in Chapter 7. This implies a trade-off between speed and memory consumption on the server. However, the problem when working with sessions is that they're configured at the application level in web.xml, using the <session-timeout> subelement of the <session-config> element. This means developers are in charge of them in the beginning and may have their own reasons for configuring them the way they are.
You must weigh the needs of your server against the needs of the developer’s application and its users. Ultimately, you have responsibility for the application once it’s deployed on your server, so you have the means and the authority to change the session settings as appropriate.
In extreme cases, such as a server that is running short of memory because of a large number of long-lived sessions, you may need to reduce the session timeout.
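For example, an application's web.xml might be adjusted as follows to shorten the session timeout; the 15-minute value here is purely illustrative:

```xml
<!-- Sketch: reduce the session timeout to 15 minutes in web.xml. -->
<session-config>
    <session-timeout>15</session-timeout>
</session-config>
```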
The simplest Tomcat setup with a single stand-alone Tomcat server using an HTTP connector is usually appropriate for very small installations. However, as load increases, you should consider the following alternative configurations.
It’s possible, under certain conditions, for the JVM to become a bottleneck, even if a single server is sufficient. The JVM isn’t optimized for dealing with huge amounts of memory, so breaking it into multiple processes on the same system may help, as discussed in Chapter 13.
If application performance is constrained by the limits of the operating system or server hardware, it may be necessary to load balance two or more application servers, as discussed in Chapter 9.
While Tomcat has an HTTP connector, it isn’t optimized as an HTTP server. Bringing Apache or other supported Web servers into the picture would increase performance, as they’re designed for handling only HTTP requests, as discussed in Chapter 9.
A well-configured server is no defense against inefficient application code deployed within it. The best weapon in this situation is a clear understanding of the performance of your server when it's unencumbered with sluggish code. Regardless of what the reality is, the onus is always on you as the server administrator to identify the bottleneck. Thorough pre-application load testing and analysis will allow you to rule out the server itself and identify the application code as the likely culprit.