Case Study

This case study gives you a quick peek at the case studies in the rest of the book. Instead of leaving the best for last, we give you the whole shebang here. In this case study, we incorporate JUnitPerf, HttpUnit, and JMeter. Basically, we pound on the site with JMeter at the same time we run an HttpUnit test decorated with a JUnitPerf timed test. We essentially scale up the number of simulated users until we no longer meet the timed requirement.

Of course, we have not covered JMeter or HttpUnit yet, but this case study will give you a taste of how to use those tools.

HttpUnit Test

Remember that JUnitPerf is a sort of parasite; it needs a host JUnit test to work. Thus, the first part of this case study is writing an HttpUnit test that will test the site. For this case study, we borrowed the code from the pet store HttpUnit case study, which we discuss in Chapter 13, in the section "Case Study: Adding an Entity Bean to the Pet Store." We won't cover the code in detail here because we'll present it later; briefly, the test code performs navigation and form entry. Here is a very brief overview of the code to test the site. First, we import the needed HttpUnit and support classes as follows:

import com.meterware.httpunit.*;

import java.io.IOException;
import java.net.MalformedURLException;

import org.xml.sax.*;
import org.w3c.dom.*;

import junit.framework.*;

Next, we define some constants that point to our site:

protected static String HOST = "http://localhost/pet/";
protected static String MGMT = "mgmt/";
protected static String INDEX_PAGE = "index.jsp";
protected static String SUBCATEGORY_PAGE = "subcategory.jsp?id=222";

In the real world, you might make the host a command-line argument; for this example, we just hard-coded it. Let's look at the navigation tests.
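If you did want it configurable, a minimal sketch might read the host from a system property instead; the property name petstore.host here is our own invention, not part of the book's code:

// Hypothetical alternative to hard-coding: read the host from a
// -Dpetstore.host=... system property, defaulting to localhost.
protected static String HOST =
        System.getProperty("petstore.host", "http://localhost/pet/");

You would then point the tests at another server by running them with, for example, -Dpetstore.host=http://staging/pet/ on the java command line.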

Here is the code to test the main page:

public void testIndex() throws Exception {
    WebConversation wc = new WebConversation();
    WebResponse resp = wc.getResponse(HOST + INDEX_PAGE);
    WebTable table = resp.getTables()[2];

    assertEquals("# of tables", 3, resp.getTables().length);
    assertEquals("message in third table",
                 "Welcome to AAA Pets",
                 table.getCellAsText(0, 0));
}

This code verifies that the index page has exactly three HTML tables and displays the message "Welcome to AAA Pets" in the body of the third table.

Here is the code to test the subcategory page. This test verifies that there are three links corresponding to the products (breeds) associated with the subcategory 'cats':

public void testSubCategory() throws Exception {
    WebConversation wc = new WebConversation();
    WebResponse resp = wc.getResponse(HOST + SUBCATEGORY_PAGE);

    assertNotNull("Cat Breed #1", resp.getLinkWith("Calico"));
    assertNotNull("Cat Breed #2", resp.getLinkWith("Jaguar"));
    assertNotNull("Cat Breed #3", resp.getLinkWith("Siamese"));
}

Notice that the test checks for the sample data that was populated by the buildDB.xml Ant buildfile discussed in the last case study on JUnit. There is also a test on Product (testProduct) that we haven't listed here, because we will discuss it in the HttpUnit case study.

In addition to these navigation tests, three tests exercise the form entry on the product management page (testCreate, testDelete, and testEdit). For example, here is the code to test creating a product:

/**
 * Verifies that it can create a new product.
 */
public void testCreate() throws Exception {
    WebConversation wc = new WebConversation();
    WebResponse resp = wc.getResponse(HOST + MGMT + SUBCATEGORY_PAGE);

    WebForm form = resp.getForms()[0];
    WebRequest req = form.getRequest("add");
    resp = wc.getResponse(req);

    form = resp.getForms()[0];
    req = form.getRequest();
    req.setParameter("name", "Persian");
    req.setParameter("price", ".00");
    req.setParameter("qty", "2");

    resp = wc.getResponse(req);
    resp = wc.getResponse(HOST + MGMT + SUBCATEGORY_PAGE);

    assertNotNull("link for 'Persian'", resp.getLinkWith("Persian"));
}

Now that you get the gist of the HttpUnit test for the Web navigation and product management, let's look at the JMeter test.

JMeter Configuration

For the JMeter configuration, as with HttpUnit, we will navigate the site and manage products (add, delete, edit) on the back end. For this test, we set up the back end with one simulated user that edits the site at intervals randomly distributed around 30 seconds with a standard deviation of 5 seconds, using the Gaussian random timer. A front-end user hits the site every three seconds with a standard deviation of five seconds, again using the Gaussian random timer.
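To make the timer behavior concrete, here is a minimal sketch of the idea behind a Gaussian random timer: a constant offset plus a normally distributed deviation. This is our own illustration of the delay calculation, not JMeter's source code:

import java.util.Random;

// Illustrative only: mimics what a Gaussian random timer computes.
// delay = constant offset + (deviation * sample from N(0, 1))
public class GaussianDelay {
    private static final Random RANDOM = new Random();

    static long delayMillis(long offsetMillis, long deviationMillis) {
        // nextGaussian() draws from N(0, 1); scaling by the deviation
        // spreads the delays around the offset.
        long delay = offsetMillis
                + (long) (RANDOM.nextGaussian() * deviationMillis);
        return Math.max(0, delay); // never sleep a negative amount
    }

    public static void main(String[] args) {
        // Back-end editor: roughly every 30 seconds, deviation 5 seconds.
        System.out.println(delayMillis(30000, 5000) + " ms");
        // Front-end visitor: roughly every 3 seconds, deviation 5 seconds.
        System.out.println(delayMillis(3000, 5000) + " ms");
    }
}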

In Chapter 17, "Performance Testing with JMeter," we'll explain in depth how to set up JMeter to perform this magic. For completeness, the following figure shows the configuration panel for adding a test product to the Web site using JMeter.

[Figure: JMeter configuration panel for adding a test product to the Web site]

This figure also shows that we have two thread groups: one to perform product management and one to handle front-end navigation of the site, similar to testProduct, testIndex, and testSubCategory in the earlier HttpUnit test. Notice that even though the figure shows the Product Management thread group and the Navigation thread group in the same JMeter instance, we run them in two different instances because JMeter sums the times; we want the simulated users that are navigating to be faster than the simulated users that are performing product management.

Putting It All Together

Now that we have the JMeter test and the HttpUnit test, let's require the test that navigates and edits a product to complete in less than five seconds. In addition, we want to determine how many simultaneous users we can support and still meet this threshold. After viewing the Web logs of an existing site, we model its behavior and create a JMeter test (from the previous section).

We will decorate the HttpUnit test with a JUnitPerf timed test. The existing HttpUnit test starts its test as follows:

public static void main(String args[]) {
    junit.textui.TestRunner.run(suite());
}

public static Test suite() {
    return new TestSuite(HttpUnitTest.class);
}

When we run this test a few times to make sure that Resin (the servlet engine) compiles the JSPs, we get the following output:

......
Time: 1.202

OK (6 tests)

Keep in mind that we are running Resin (the servlet engine), JBoss (the EJB server), and SQL Server 2000 on the same box (a laptop). If you are running in a more distributed environment or on a beefier box (like a real server), you may get better results.

Now that we have a baseline, let's decorate the test with the TimedTest class to require that it execute within five seconds, as follows:

public static void main(String args[]) {
    int maxElapsedTime = 5000;
    Test timedTest = new TimedTest(suite(), maxElapsedTime);
    junit.textui.TestRunner.run(timedTest);
}

Just for a baseline, we run both thread groups for a while with one thread each and get the following results:

......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 1292 ms

Time: 1.302

OK (6 tests)

Now, we increase the number of front-end simulated users in JMeter to five. We stop and start the Navigation thread group and rerun the JUnitPerf-decorated HttpUnit test. We get the following results:

......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 1232 ms

Time: 1.232

OK (6 tests)

We run this test a few times to make sure we get the same results. Now, we crank up the number of users to 50, because we want to see some action. We also notice that we're running the TimedTest several times manually, so we decide to decorate it with a RepeatedTest as follows:

public static void main(String args[]) {
    //junit.textui.TestRunner.run(suite());
    int maxElapsedTime = 5000;
    Test timedTest = new TimedTest(suite(), maxElapsedTime);
    Test repeatTest = new RepeatedTest(timedTest, 5);
    junit.textui.TestRunner.run(repeatTest);
}

The results with 50 users are as follows:

......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 1382 ms
......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 501 ms
......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 591 ms
......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 500 ms
......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 461 ms

Time: 3.435

OK (30 tests)

As you can see, the overall performance improved after the first iteration because the test class had already been created.

Now it's time to get nasty. We increase the JMeter test to run 5,000 simulated users with the Navigation thread group. We get the following results:

......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 3305 ms
......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 6670 ms
F......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 420 ms
......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 301 ms
......TimedTest (WAITING): test.xptoolkit.petstore.httpunit.HttpUnitTest: 330 ms

Time: 11.026
There was 1 failure:
1) test.xptoolkit.petstore.httpunit.HttpUnitTest "Maximum elapsed time
exceeded! Expected 5000ms, but was 6670ms."

FAILURES!!!
Tests run: 30,  Failures: 1,  Errors: 0

As we can see, on our lowly laptop we can handle around 5,000 users attacking the site before we start to fail the test by going over five seconds (we are still under five seconds on average). Of course, because everything is running on one laptop, there is no network latency. The same techniques could be used on a real Web application with real requirements to help you determine what hardware you need, how to partition your Web application, and so on.

Real Test

For a real test, you should increase the number of users at a slower pace and have the JUnitPerf test repeat 30 times; if any of the 30 iterations failed, the test would fail. This way, passing the test would not be a fluke of system performance fluctuation, but a realistic result.
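A minimal sketch of that decoration, following the same pattern as the earlier listing, might look like this:

public static void main(String args[]) {
    int maxElapsedTime = 5000;
    // Each of the 30 iterations must individually finish within five
    // seconds; a single slow iteration fails the whole run.
    Test timedTest = new TimedTest(suite(), maxElapsedTime);
    Test repeatTest = new RepeatedTest(timedTest, 30);
    junit.textui.TestRunner.run(repeatTest);
}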

 

In the case studies throughout the book, you learn the ins and outs of HttpUnit and JMeter. We incorporated JUnitPerf, HttpUnit, and JMeter in this case study to pound on the site with JMeter and measure its performance under load; JUnitPerf wrapped an HttpUnit test that navigated the site and edited the back end. We essentially scaled up the simulated navigation users until we no longer met the timed requirement.



