Introducing JMeter

Like some of the best software tools, JMeter was built out of necessity. Stefano Mazzocchi originally created the application for testing the performance of Apache JServ, the forerunner to Tomcat.

JMeter is a Java application for load testing functional behavior. Initially, JMeter only supported the testing of Web applications. However, since its inception, the product has evolved to support the load testing of different parts of the system architecture, including database servers, FTP servers, LDAP servers, and Web Services. JMeter is extensible, so you can easily add to this list.

JMeter is available from the Apache Software Foundation under the Apache Software License and can be downloaded from the Apache Web site. Our example uses version 2.0.1.

Unlike HttpUnit, which is purely an API, JMeter is a complete framework and offers a Swing-based user interface for defining and executing load and stress tests. It also provides graphical reports for analyzing test results.

Load testing with JMeter involves using the JMeter user interface to define a number of functional test cases. JMeter uses these test cases to simulate multiple users accessing the system by running each functional test repeatedly from a number of concurrent threads.

The test cases, the number of threads, and the number of times each test case is executed are all configurable elements of a JMeter test plan. JMeter takes responsibility for executing a test plan against the system under test, spinning up threads as needed in order to generate the required load.
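As a rough sketch of how these settings combine (plain arithmetic, not JMeter code), the number of samples a thread group generates is the product of its threads, loops, and samplers per loop:

```python
def total_samples(threads, loops, samplers_per_loop):
    # Each thread runs every sampler once per loop iteration
    return threads * loops * samplers_per_loop

# Our MedRec scenario: 100 threads, one loop of 1 login + 100 searches
expected_load = total_samples(100, 1, 101)
```

With 100 threads each sending 101 requests, the plan fires 10,100 requests at the server in total.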

We examine the elements of a test plan and the fundamental concepts behind JMeter by building up a simple example. Continuing from the previous discussion of HttpUnit, we stay with the MedRec system but this time put the application through its paces from a performance perspective.

Testing MedRec with JMeter

The functional tests created in the HttpUnit example verified the behavior of the application for a physician entering a username and password at the login page and navigating to the patient search page. To test the system under load, we use a functional test scenario that goes one step further and initiates a patient search after passing the login page. The objective of the test is to evaluate the patient search under load.

The steps for the performance test scenario are as follows:

  • Access the physician application via the login page.

  • Submit the login form with a valid username and password.

  • From the search page, initiate 100 patient searches.


Like JUnit, JMeter supports the use of assertions in the creation of its functional test cases. These assertions verify the results returned by the system under test. We do not use assertions in the example, but it is good practice to confirm functional behavior in tandem with performance. A well-performing system that has incorrect functionality isn't much use to anyone. Furthermore, unexpected behavior may surface only when the system is under load.

Figure 15-2 shows the JMeter GUI with a test plan open for executing our load test. Over the next sections, we examine how this test plan is built up and the purpose of each of the plan's elements.

Figure 15-2. JMeter with the MedRec test plan open.

The test plan comprises test elements that control the execution of the load test. Elements are added to the test plan as nodes in the tree in the left pane of the JMeter GUI.

The tree structure enables the hierarchical organization of plan elements. Here is a description of the main element types we use in the MedRec test plan:

  • A thread group is the starting point for the test plan and controls the number of threads used to execute the functional test cases.

  • A sampler sends requests, such as HTTP requests, to the server.

  • A logical controller allows you to instruct JMeter when to issue server requests.

  • A configuration element can add to or modify server requests.

  • A listener provides a view of the data JMeter gathers from a running test plan.

You add new elements to the plan by right-clicking a node in the tree and selecting the Add menu item from the element's context menu. The action presents a menu of all child elements available for selection. We begin by adding a thread group to the top-level test plan node.

Creating a Thread Group

A thread group represents a virtual user, or tester, and defines a set of functional test cases the JMeter virtual user executes as part of the test plan. The JMeter GUI displays configuration options for the thread group element in the right pane.

There are several options of interest for a thread group:

Number of threads.

This option tells JMeter how many threads to allocate to the thread group for running the load test. Specifying the number of threads is the equivalent of defining the number of simultaneous end users who run the test cases.

Ramp-up period.

Specifying a ramp-up period has JMeter create the threads in the thread group gradually over a given duration. For example, you may wish to have 10 threads spinning up over 180 seconds. This option is useful for monitoring performance as the load increases.

Loop count.

The loop count defines the number of times to execute the thread group's test cases. The default is to run continuously. When creating a test plan, it's a good idea to set this value to just a single iteration, as this makes troubleshooting considerably easier.
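JMeter's exact thread scheduling is internal to the engine, but a simple model of the ramp-up option is that threads start evenly spaced across the ramp-up period. A sketch, assuming even spacing:

```python
def ramp_up_start_times(num_threads, ramp_up_seconds):
    # Model: thread i starts at i * (ramp_up / num_threads) seconds,
    # so the full thread count is reached by the end of the period
    interval = ramp_up_seconds / num_threads
    return [i * interval for i in range(num_threads)]
```

For the example from the ramp-up discussion, 10 threads over 180 seconds means a new thread starts every 18 seconds.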


The scheduler option is a checkbox. When it is checked, additional fields appear in the configuration pane for setting the test's start and end times.

The example contains only a single thread group, but the test plan node supports the addition of many thread groups, each with its own configuration and test cases.

For the example, the number of threads is set at 100 with a ramp-up period of one second. The thread group is set to run continuously.


The Configuration Elements

Configuration elements modify requests sent to the server. They work closely with the sampler elements, which are responsible for sending requests. The MedRec test plan uses two types of this element: an HTTP request defaults and a cookie manager. We add both of these elements to the plan as immediate children of the thread group node.

The HTTP Request Defaults

As in the HttpUnit example, testing a Web application means submitting HTTP and HTTPS requests to the Web server. The requests sent to place the Web application under load are likely to share a common set of configuration options.

JMeter provides the HTTP request defaults element as a convenience for storing these common options. Here are some of the main settings you may wish to set for each request:


Protocol.

This option specifies the protocol for sending each request, either HTTP or HTTPS.

Server name or IP.

Use this field to set the domain name, or IP address, of the server running the system under test.


Path.

The path option sets the Uniform Resource Identifier (URI) for the page. You can also set default parameters for sending with the request, but it is likely these will be set for each individual request.

Port number.

Set this option if your Web server is listening on a port other than port 80 (which is the default).

The example sets the Server Name or IP to the machine hosting the MedRec application and the Port Number to 7001. All other options remain unset.
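The way a sampler's blank fields fall back to the defaults element can be modeled as a simple dictionary merge. This is an illustrative sketch, not JMeter code; the field names and the host name medrec-host are placeholders:

```python
def effective_request(request, defaults):
    # Start from the defaults element, then let any non-blank
    # per-request settings override them
    merged = dict(defaults)
    merged.update({k: v for k, v in request.items() if v not in (None, "")})
    return merged

defaults = {"server": "medrec-host", "port": 7001}
login_request = {"path": "/physician/login", "port": ""}  # blank port inherits 7001
```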

Creating a Cookie Manager

The HTTP cookie manager does exactly as its name implies: it manages all cookies sent to the thread group from the Web application. Failing to add a cookie manager to the thread group is the equivalent of disabling cookies within the browser.

For the purposes of the example, the cookie manager element uses the default settings.
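Conceptually, the cookie manager gives each thread its own cookie jar: cookies set by the server are remembered and replayed on later requests, which is what keeps the login session alive. A minimal Python model (not JMeter code; real cookie handling honors attributes such as Path and expiry):

```python
class SimpleCookieJar:
    """Toy model of per-thread cookie management."""

    def __init__(self):
        self.cookies = {}

    def store(self, set_cookie_header):
        # Keep only the name=value pair, discarding attributes like Path
        name, _, value = set_cookie_header.split(";")[0].partition("=")
        self.cookies[name.strip()] = value.strip()

    def request_header(self):
        # Value to send back in the Cookie header of the next request
        return "; ".join(f"{n}={v}" for n, v in self.cookies.items())
```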

Logic Controllers

Logic controllers let you determine when JMeter issues requests. They direct the execution order of test plan elements, and so orchestrate the flow of control for a test.

JMeter provides a range of different logic controllers. Table 15-3 lists those that are available.

Table 15-3. Logic Controllers

ForEach

Iterates through all child elements and supplies a new value on each iteration.

If

Makes the execution of child elements conditional.

Interleave

Executes an alternate child sampler element on each loop of the controller's branch.

Loop

Iterates over each child element a given number of times.

Module

Offers a mechanism for including test plan fragments into the current plan from different locations.

Once Only

Runs elements of the controller only once per test.

Random

Randomizes the execution order of subcontrollers.

Recording

Placeholder for indicating where the HTTP proxy server element should record all data.

Simple

Placeholder for organizing elements.

Throughput

Used for throttling the requests sent by its child elements.

Transaction

Times how long all child elements take to run, then logs the timing information.

For the example, the flow of control sees the test case log in to the application and perform 100 patient searches. To achieve this, we use two types of controller: a simple controller and a loop controller.

The Simple Controller

Adding a simple controller provides a placeholder for organizing the sampler elements of the plan. The element type has no configuration options other than a name.

The simple controller in the example has two child elements: an HTTP request sampler for the login page and a loop controller. When JMeter executes the test plan, the login request under the simple controller executes first, followed by the elements of the loop controller.

Adding a Loop Controller

Unlike the simple controller, the loop controller is more than a placeholder. The loop controller states the number of times to iterate through each of the controller's child elements.

For the example, the loop controller's Loop Count property is set to 100. Thus, we have the login request being sent once, followed by 100 search requests. Of course, we still have to add a suitable HTTP request sampler element for the search request to the loop controller, as well as a sampler for the login request as a child of the simple controller. We look at the configuration of these sampler elements next.
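The resulting flow of control, a single login followed by 100 searches, can be sketched by flattening the controller tree into an execution order. This is an illustrative model of how the controllers compose, not JMeter's implementation:

```python
def run_element(element):
    """Flatten a tree of controllers and samplers into sampler order."""
    kind = element["type"]
    if kind == "sampler":
        return [element["name"]]
    if kind == "loop":
        # Loop controller: run all children once per iteration
        executed = []
        for _ in range(element["loops"]):
            for child in element["children"]:
                executed.extend(run_element(child))
        return executed
    # Simple controller: run children once, in order
    return [name for child in element["children"] for name in run_element(child)]

plan = {"type": "simple", "children": [
    {"type": "sampler", "name": "login"},
    {"type": "loop", "loops": 100, "children": [
        {"type": "sampler", "name": "search"}]},
]}
```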


The Sampler Elements

Up until now, the test plan doesn't do much. To put the MedRec application under some strain, we need to start sending some actual requests. For this, we need a sampler element.

A sampler submits requests to the target server. JMeter supplies several types of sampler elements, making it possible to test systems other than Web applications. Table 15-4 describes each sampler provided as part of the JMeter installation.

Table 15-4. JMeter Samplers



FTP Request

Sends an FTP get request to the server to retrieve a file.

HTTP Request

Submits an HTTP or HTTPS request to a Web server.

Java Request

Allows you to control any Java class that implements the JavaSamplerClient interface.

JDBC Request

Enables the execution of SQL statements against a database.

LDAP Request

Issues an LDAP request to a server.


SOAP/XML-RPC Request

Supports sending a SOAP request to a Web Service or allows an XML-RPC request to be sent over HTTP.

Because MedRec is a Web application, we must generate HTTP requests, and JMeter provides the HTTP request sampler for this purpose.

Our test scenario calls for two HTTP request sampler elements, one for making the login request and another for initiating the patient search.

Making a Login Request

The first page accessed as part of the test is the physician login. Navigating past this page requires submitting a login request to establish our security credentials for the session. This process involves issuing the appropriate parameters as part of the request: a username and password.

We covered the login process in the HttpUnit example, where Listing 15-1 used an instance of WebForm to submit the login page. JMeter works very differently from HttpUnit, but rather than cover the login page twice, let's defer the discussion of how to submit request parameters with JMeter until we reach the search page.

For now, Figure 15-3 shows the configuration of the sampler for the login page.

Figure 15-3. HTTP login request settings.

Notice that some of the options for the request are blank, including the Server Name or IP, Port Number, and Protocol. You can leave these options blank because the HTTP request sampler inherits them from the HTTP request defaults element created earlier and added at the top level of the thread group.

Add the sampler for the login requests as a child of the simple controller. To complete the test case, we need a final sampler for the search request.

Submitting a Search Request

The patient search requires an HTTP request element as a child of the loop controller. This element sends the parameters for the patient search. The search page uses an HTML form element for sending search requests, so the sampler needs to mimic the action of the form.

Here is an edited extract of the HTML source for the patient search page showing the form:

 <form name="searchBean"
       method="POST"
       action="/physician/">
   <input type="text" name="lastName" value="">
   <input type="text" name="ssn" value="">
   <input type="submit" name="action" value="Search">
 </form>

The searchBean form takes either the patient's name or social security number. Our test case uses the name for searching. Figure 15-4 shows the configuration of the HTTP request element for the form.

Figure 15-4. HTTP search request settings.

The HTTP request sampler in Figure 15-4 simulates the sending of the searchBean form as if submitted from a browser. To match the form, you first need to switch the method from a GET, which is the default, to a POST. The HTTP request sampler provides a handy set of radio buttons for making this change.

Next, the path setting for the request must correspond to the value of the action attribute from the form. Set this to /physician/.

For the final task, add the form's parameters to the HTTP request. We are searching on the patient's last name, so you can ignore the social security number.

Two parameters are required to initiate a search. The first is the lastName parameter, which specifies a string for matching against patient surnames. The second is the value associated with the submit button, in this case the action parameter, which is assigned a value of Search. Don't forget to add this value, or the MedRec application will not know how to handle the request correctly.
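The body of the POST the sampler sends is simply these form fields URL-encoded. A sketch of the request body for our search, using Python's standard library:

```python
from urllib.parse import urlencode

def search_post_body(last_name):
    # Parameters mirror the searchBean form: the empty ssn field and the
    # action=Search value from the submit button are sent as well
    return urlencode({"lastName": last_name, "ssn": "", "action": "Search"})
```

For a search on the surname Smith, the sampler posts `lastName=Smith&ssn=&action=Search` to /physician/.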

The search request completes the setup of the test plan for running the test. However, the test plan is still not complete: we need to tell JMeter how we wish to view the data gathered from running the test. This is accomplished using a listener element.


The Listener Elements

JMeter uses listener elements to analyze the data gathered from a test. A selection of listeners is available, each providing a different presentation format for the data collected during the execution of the test. In addition to rendering this data in the GUI, each listener type can log the information to a file for later interrogation with other analysis tools.


JMeter stores test results as either an XML document or a comma-separated value (CSV) file. XML is the default, but the CSV format is very useful for importing the data into spreadsheets like Microsoft Excel.

To change the format, locate the output format entry in the jmeter.properties configuration file and set its value to csv.
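Once results are logged in CSV format, they are easy to post-process outside JMeter. A sketch using Python's csv module, with an illustrative extract of a result log (column names here are a hypothetical subset; a real log carries more fields):

```python
import csv
import io

# Hypothetical extract of a CSV-format result log
SAMPLE_LOG = """label,elapsed,success
Login Request,120,true
Search Request,45,true
Search Request,52,true
"""

def load_results(text):
    # Each row becomes a dict keyed by the header line
    return list(csv.DictReader(io.StringIO(text)))
```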

The listener elements interpret data for their parent thread group. Add a listener to the thread group node by right-clicking the node and selecting Add, then Listener, and then choosing from the available listener elements.

The example test plan uses these listeners:

View results tree.

This listener presents a text-based hierarchical view of the requests and responses sent and received during the test.

Aggregate report.

The aggregate report is a text-based listener that displays summary information for each separately named request used in the test.

Graph results.

This listener provides a simple graphical view of the test results, including the average response time and throughput rate.

Refer to the JMeter documentation for a full list of listeners.


The view results tree listener is especially useful when initially building the test plan for determining if the plan is executing as expected. It is of less value when intensive testing is underway due to the volume of data presented.

With the listeners in place, the test plan is complete and ready to run.

Executing the Test Plan

It is highly advisable to save the test plan before running it. The JMeter engine can potentially spin up a large number of threads, and things can go wrong, so always save the plan before starting the test.

Set the thread group so it loops forever, and start the test from the Run menu with the Start item. You can stop the test with the Stop item from the same menu. By clicking on the different listener elements, you can observe the status of the test while it is in progress.

Analyzing the Results

The listeners display the data gathered for the test. These next sections examine the information presented in the aggregate report listener and the graph result listener. Although JMeter provides a number of other inbuilt listeners, these two are indicative of the type of information JMeter generates for a load test.

Aggregate Report Listener

The aggregate report displays the following summary information for the login and search requests.

  • Number of requests

  • Average response time

  • Minimum response time

  • Maximum response time

  • Error percentage

  • Throughput rate in terms of requests per second
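The summary figures above can be computed directly from the raw sample timings. A sketch of the calculation (illustrative, not JMeter's internal code), with timings in milliseconds:

```python
def aggregate_row(timings_ms, duration_seconds, errors=0):
    # One row of the aggregate report for a single named request
    return {
        "count": len(timings_ms),
        "average": sum(timings_ms) / len(timings_ms),
        "min": min(timings_ms),
        "max": max(timings_ms),
        "error_pct": 100.0 * errors / len(timings_ms),
        "throughput": len(timings_ms) / duration_seconds,  # requests/second
    }
```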

Figure 15-5 shows the results from the aggregate report listener for the example test plan.

Figure 15-5. Aggregate Report Listener.

The report provides a concise and easy-to-read representation of the data in table format. In this case, the MedRec application has maintained a high throughput for the given load.

Because the aggregate report listener displays summary information, it is not possible to see how the Web application behaved during the running of the test. Viewing this information requires a graphical listener.

Graph Result Listener

For a graphical display of the test results, the graph result listener plots several types of performance information, including data samples, the average and median sample times, standard deviation, and throughput.
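The average, median, and deviation that the listener plots are standard statistics over the sample timings. A sketch using Python's statistics module (population standard deviation assumed here):

```python
import statistics

def graph_values(timings_ms):
    # The per-sample statistics the graph listener plots
    return {
        "average": statistics.mean(timings_ms),
        "median": statistics.median(timings_ms),
        "deviation": statistics.pstdev(timings_ms),
    }
```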

Showing all of this information in black and white makes the graph hard to read, so the graph result listener shown in Figure 15-6 plots only the data samples and the average response times during the test.

Figure 15-6. Graph Result Listener.

The graph has the duration of the test on the X-axis and the server response times for HTTP requests on the Y-axis. The black dots are the individual timings for each request. Ideally, the time taken for the application to handle a request should be uniform. A reasonably tight grouping of the dots represents a consistent response time for each request. The load on the server was quite light, and MedRec coped with it comfortably, as illustrated by the Y-axis topping out at only 552ms.

The line through the dots is the average. Again, this should be fairly even. Apart from the step curve at the start where the test was ramping up, this is certainly the case.

    Rapid J2EE Development: An Adaptive Foundation for Enterprise Applications
    ISBN: 0131472208
    Year: 2005
    Author: Alan Monnox