18.5. Testing Enterprise Components with Cactus

As we've seen, JUnit is a very capable tool for defining, running, and reporting on unit tests for basic Java code, such as the data access classes, model classes, and utility classes in your system. Enterprise components, however, present a unique challenge when it comes to unit testing. In order to test enterprise components effectively and accurately, you need a way to drive the components (i.e., provide them with the environment settings and inputs that they'll encounter at runtime) and capture the results so that they can be verified against expected results.

At a high level, the two general approaches to testing enterprise components are:


Mock objects model

In this scheme, the enterprise component is instantiated in a standard Java virtual machine, and the inputs and environment settings for the component are simulated using special implementations called mock objects. The advantage of the mock objects model is that it can be applied in any environment and does not require a full enterprise server environment. The disadvantage is that your components are not tested in a true container environment, which could lead to inaccurate or misleading test results in some cases.


In-container testing

In this scheme, the enterprise components run within an actual container environment, and the test environment drives the tests, either from within the server environment or externally through a special client. This approach has the advantage of testing the components in a more realistic environment using an actual J2EE container to process the inputs and exercise the container. This approach has the disadvantage of being more complicated to implement: a container-enabled server has to be available to test the components, and managing the test runs (driving the tests and collecting the results) is complicated by the fact that the tests are running within a server environment.

In this part of the chapter, we discuss Cactus, which is an extension of JUnit managed by the Apache Jakarta project. Cactus is an in-container enterprise testing framework. We are not covering mock objects approaches here because, for the most part, the principles we discussed in the earlier JUnit material can be used to manage mock objects tests. In terms of alternative in-container testing frameworks, we chose to discuss Cactus for the same reasons we chose other enterprise tools in this book: it is popular, it has an active development effort behind it, and it works well with other popular enterprise tools, such as Ant, various IDEs, and J2EE servers of various types.

18.5.1. The Cactus Model and Architecture

Cactus uses a proxy architecture to run server-side tests, as shown in Figure 18-2. A JUnit TestRunner invokes a Cactus-based TestCase, and the TestCase invokes a Cactus proxy on a designated server. The Cactus proxy runs inside the server's web container, taking requests from clients to run specific tests. The Cactus proxy runs the tests on behalf of the client, and the tests exercise various server-side components to test their behavior. The proxy collects the results (success, failure, or error) and returns them to the client to be reported. The Cactus TestCase (the client in this model) accepts the results and passes them on to the TestRunner, and the TestRunner reports the results just as it does for "regular" JUnit tests.

Figure 18-2. Cactus runtime model


As the figure indicates, testing your server-side code with Cactus involves enhancing your application archive to include the necessary Cactus libraries as well as your Cactus-based unit tests, and then driving these tests through a Cactus proxy using a testing client of some sort. So to use Cactus to test your enterprise applications, you need to understand the process used by the Cactus proxies to drive your test, the different types of proxies that Cactus provides for running different flavors of test cases, the types of test clients provided by Cactus for invoking your server-side tests, how Cactus unit tests are actually written, and how to "Cactus-enable" your enterprise application archive to allow your unit tests to be run.

18.5.1.1. The Cactus test process

Let's look at the deeper details behind the Cactus test process shown in Figure 18-2. These details are essentially the same regardless of what kind of test you're running and which Cactus proxy is handling the tests.

  1. Starting a test run that includes Cactus tests is the same as any JUnit test run. You start a TestRunner and tell it which tests to run.

  2. For each test method to be executed, the TestRunner invokes the runTest( ) method on the corresponding TestCase it's asked to run. Again, this is no different from any other JUnit tests.

  3. If a given test is a Cactus test (i.e., it extends one of the Cactus TestCase base classes as described in "Writing Cactus Tests" later in this chapter), its runTest( ) method checks for a begin(WebRequest) method on the TestCase and tries to execute it before running each test included in the TestCase. This method can be used to initialize any general resources or do any setup required by all tests. The WebRequest argument is a Cactus object that represents the simulated web request that will be passed to your test. You can use this object in the begin( ) method to add parameters such as required headers or URL arguments to the request. Again, begin( ) is called before each test method on the TestCase, so it should do only general-purpose setup applicable to all tests.

  4. If the test method to be run is called testXXX( ), runTest( ) looks for a beginXXX(WebRequest) method on the TestCase and executes it before running the test. This method serves the same general purpose as begin( ) (that is, adding URL arguments, HTTP headers, and the like to the request sent to the test proxy) but is specific to a particular test method.

  5. At this point, the runTest( ) method on the TestCase connects to the appropriate Cactus proxy on the test server and asks it to run the target test method on this TestCase. Any parameters or header values set in the WebRequest object by the begin( ) or beginXXX( ) methods are passed to the server proxy. The proxy that is contacted depends on the type of TestCase being run. If the TestCase extends ServletTestCase, the ServletRedirector proxy is invoked. A JspTestCase is run through the JspRedirector, and a FilterTestCase is run through the FilterRedirector. There are system properties that Cactus uses to locate the test server and the proxies running there. They are detailed in "Running Cactus Tests," later in this chapter.

  6. The Cactus proxy accepts the test run request and creates an instance of the target TestCase on the server. The proxy creates Cactus wrappers for all the necessary web context objects (i.e., servlet tests are given HttpServletRequest, HttpServletResponse, ServletConfig, and ServletContext objects). These objects wrap the real context objects generated as part of the web call and ensure that the tests see realistic responses when context information is requested (e.g., the servlet URL available from an HttpServletRequest is representative of the web component being tested, not the Cactus proxy). These web context objects are set as member variables on the TestCase (through the Cactus base classes that they extend).

  7. With the TestCase primed, the proxy now acts as the test driver and calls the setUp( ), testXXX( ), and tearDown( ) methods on the TestCase, in order. The setUp( ) and tearDown( ) methods can be used for server-side initialization and cleanup for each test to be run, as opposed to the begin( ) and end( ) methods, which perform initialization and cleanup on the client side of the Cactus proxy. The test method should exercise the server-side code it's supposed to test (invoke a servlet, JSP, JSP custom tag, servlet filter, EJB, or any other server-side code you want to test). Assertions and failures should be used just as with regular JUnit tests. A failure or uncaught exception generated by the test is picked up by the Cactus proxy and reported back to the caller.

  8. Back on the client side of the Cactus proxy, the results of the test method are collected by the TestRunner. After each test method finishes, the corresponding endXXX(WebResponse) method on the TestCase is run (if it exists), letting you do any test-specific verification of the response returned by the test call. Then the generic end(WebResponse) method is called, allowing you to perform any generic verification of the response. In either of these methods, you can use assertions and failures as you would in your test method. Note that both of these method calls (endXXX( ) and end( )) are done on the TestCase instance on the client side of the proxy.
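Pulling these steps together, the skeleton below marks where each lifecycle method runs for a hypothetical testSomething( ) method. The class and test names are placeholders of our own; the method signatures are the ones Cactus dictates:

 import org.apache.cactus.ServletTestCase;
 import org.apache.cactus.WebRequest;
 import org.apache.cactus.WebResponse;

 public class LifecycleSketchTest extends ServletTestCase {

     public void begin(WebRequest request)          { /* 1. client side, before every test   */ }
     public void beginSomething(WebRequest request) { /* 2. client side, before testSomething */ }

     protected void setUp()                         { /* 3. server side, run by the proxy     */ }
     public void testSomething()                    { /* 4. server side, the test itself      */ }
     protected void tearDown()                      { /* 5. server side, after the test       */ }

     public void endSomething(WebResponse response) { /* 6. client side, checks the response  */ }
     public void end(WebResponse response)          { /* 7. client side, after every test     */ }
 }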

The three test proxies provided by Cactus alter this general test process in terms of the context objects that they provide to the TestCase, as described next.

18.5.1.2. Cactus test proxies

Cactus provides three types of proxies (called "redirectors" in the Cactus documentation) for driving your server-side tests. Each proxy is implemented as a different type of web component to test server-side code that needs to be run in various web contexts:


Servlet proxy

This test proxy is a servlet that exercises tests from within a servlet runtime context. This provides tests with access to servlet runtime objects, such as HttpServletRequest, HttpServletResponse, ServletConfig, and ServletContext objects. The servlet proxy is useful for testing servlets or for testing server-side code that may be invoked from a servlet, like EJBs or server-side utility code.


JSP proxy

This test proxy is a JSP that exercises tests from a JSP context. This provides tests with access to JSP runtime objects, such as the JSP PageContext. The JSP proxy is useful for testing JSPs, JSP custom tags, or other server-side code that may be invoked from a JSP.


Servlet filter proxy

This test proxy is a servlet filter that exercises tests from a servlet filter context. This provides tests with access to runtime objects available to a servlet filter, such as the FilterConfig. The servlet filter proxy is useful for testing servlet filters or other server-side code that may be invoked from a servlet filter.

In order to make use of these test proxies, they need to be added to the web.xml deployment descriptor of a web module. To deploy the servlet proxy, your deployment descriptor will need the following <servlet> element and a <servlet-mapping> similar to the one shown in Example 18-4.

Example 18-4. web.xml entries for Cactus servlet proxy
 . . .
 <servlet>
     <servlet-name>ServletRedirector</servlet-name>
     <servlet-class>
         org.apache.cactus.server.ServletTestRedirector
     </servlet-class>
 </servlet>
 . . .
 <servlet-mapping>
     <servlet-name>ServletRedirector</servlet-name>
     <url-pattern>/ServletRedirector</url-pattern>
 </servlet-mapping>
 . . .

To deploy the JSP proxy, you'll need the entries shown in Example 18-5.

Example 18-5. web.xml entries for Cactus JSP proxy
 . . .
 <servlet>
     <servlet-name>JspRedirector</servlet-name>
     <jsp-file>/jspRedirector.jsp</jsp-file>
 </servlet>
 . . .
 <servlet-mapping>
     <servlet-name>JspRedirector</servlet-name>
     <url-pattern>/JspRedirector</url-pattern>
 </servlet-mapping>
 . . .

To deploy the servlet filter proxy, you'll need the entries shown in Example 18-6.

Example 18-6. web.xml entries for Cactus servlet filter proxy
 . . .
 <filter>
     <filter-name>FilterRedirector</filter-name>
     <filter-class>
         org.apache.cactus.server.FilterTestRedirector
     </filter-class>
 </filter>
 . . .
 <filter-mapping>
     <filter-name>FilterRedirector</filter-name>
     <url-pattern>/FilterRedirector</url-pattern>
 </filter-mapping>
 . . .

Note that the proxy implementations (the classes that implement the servlet and servlet filter proxies and the JSP for the JSP proxy) need to be available in your web archive as well. The classes are included in the main Cactus jar file, and the JSP proxy file is found in the Cactus distribution. The Cactus proxies also require a set of support libraries, which are included in the Cactus distribution. The whole point of having the proxies included in the web archive is to execute your unit tests, so the test classes, plus any application code that they test, also need to be included in the web archive.
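As a rough sketch, the resulting "Cactus-enabled" archive ends up laid out something like this (the jar names track the Cactus 1.7.1 distribution used later in this section; your versions may differ):

 myapp-test.war
     jspRedirector.jsp                  (JSP proxy page, from the Cactus distribution)
     WEB-INF/
         web.xml                        (with the proxy entries shown above)
         lib/
             cactus-1.7.1.jar
             junit-3.8.1.jar
             aspectjrt-1.1.1.jar
             commons-httpclient-2.0.jar
             commons-logging-1.0.3.jar
         classes/
             (your application classes and Cactus test classes)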

You can make all of the Cactus-related enhancements to your web archive by hand, if you choose. If you are using Ant to build your project, Cactus also provides an Ant task called cactifywar that you can use to make these enhancements automatically. Before using the Cactus tasks, you'll need to load them into your buildfile. You can then use the tasks to augment a war file that you've already created:

 <!-- Include the Cactus Ant tasks -->
 <taskdef resource="cactus.tasks">
     <classpath>
         <pathelement location="${cactus.home}/lib/junit-3.8.1.jar"/>
         <pathelement location="${cactus.home}/lib/cactus-1.7.1.jar"/>
         <pathelement location="${cactus.home}/lib/cactus-ant-1.7.1.jar"/>
         <pathelement
           location="${cactus.home}/lib/commons-httpclient-2.0.jar"/>
         <pathelement
           location="${cactus.home}/lib/commons-logging-1.0.3.jar"/>
         <pathelement location="${cactus.home}/lib/aspectjrt-1.1.1.jar"/>
     </classpath>
 </taskdef>
 . . .
 <!-- Create a Cactus-enhanced version of our war file -->
 <cactifywar srcfile="myapp.war"
             destfile="myapp-test.war">
     <!-- Add our test classes to the war's classes dir -->
     <classes dir="test-classes" includes="**/*.class"/>
 </cactifywar>

The <taskdef> shown here loads all of the Cactus-supplied Ant tasks from the Cactus jars. The cactus.home property referenced in the example must be set to the directory where Cactus is installed.

The invocation of cactifywar in the example takes the web archive myapp.war, adds the Cactus proxy entries to the web.xml file, adds the Cactus libraries to the WEB-INF/lib directory of the archive, puts the JSP proxy jspRedirector.jsp into the root of the web archive, and adds our test classes to the WEB-INF/classes directory in the archive. The cactifywar task is a specialization of the built-in war task included with Ant, so it supports all of the war options. You can use the <lib> or <classes> elements, for example, to add additional classes required by your tests but not included in the original war file. In addition, the cactifywar task supports <servletredirector>, <jspredirector>, and <filterredirector> subelements that can be used to customize the URL mappings used for the proxies. If, for example, you want the JSP proxy to be accessible at the URL /runtests/jspproxy, you could use the <jspredirector> element like so:

 <!-- Create a Cactus-enhanced version of our war file -->
 <cactifywar srcfile="myapp.war"
             destfile="myapp-test.war">
     <!-- Add our test classes to the war's classes dir -->
     <classes dir="test-classes" includes="**/*.class"/>
     <!-- Specify a custom mapping for the JSP proxy -->
     <jspredirector mapping="/runtests/jspproxy"/>
 </cactifywar>

The cactifywar task adds the appropriate mapping entries in the generated web.xml file and, instead of the default mapping shown in Example 18-5, the JSP proxy is given the following mapping as specified in the <jspredirector> element:

 . . .
 <servlet>
     <servlet-name>JspRedirector</servlet-name>
     <jsp-file>/jspRedirector.jsp</jsp-file>
 </servlet>
 . . .
 <servlet-mapping>
     <servlet-name>JspRedirector</servlet-name>
     <url-pattern>/runtests/jspproxy</url-pattern>
 </servlet-mapping>
 . . .

18.5.2. Writing Cactus Tests

Luckily, all of the proxy-based test processing discussed in the previous sections is handled internally for you by Cactus. Writing a Cactus test is just a matter of extending one of the provided TestCase subclasses and implementing your test methods. Of course, we're talking about testing server-side code, so there's the added complexity of ensuring that the server code sees the same context information that it will in the production environment. But the Cactus framework does a lot to make this as simple as possible.

Cactus provides three subclasses of TestCase that you can use as the base of your own server tests: ServletTestCase, JspTestCase, and FilterTestCase, all within the org.apache.cactus package. Choosing between these TestCase variations is very simple. If your server tests need to run within the context of a servlet, use ServletTestCase; if they need to run in the context of a JSP, use JspTestCase; if they need to run in the context of a servlet Filter, use FilterTestCase. Each Cactus TestCase subclass makes context-specific objects available to the tests. A ServletTestCase has HttpServletRequest and HttpServletResponse objects, the HttpSession associated with the "client," and the ServletConfig. A JspTestCase has access to all of these objects plus the JSP PageContext and the JspWriter output stream. A FilterTestCase has access to the HttpServletRequest and HttpServletResponse objects as well as the FilterConfig and FilterChain objects needed in a servlet Filter context. These context objects can be manipulated by your test code prior to invoking the code being tested, and they can be checked after the test runs to ensure that the expected results were generated.
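These context objects are exposed as member variables inherited from the Cactus base classes. As a quick illustration, the following sketch (the class and method names are our own) does nothing but confirm that the Cactus proxy has populated the fields a ServletTestCase inherits:

 import org.apache.cactus.ServletTestCase;

 public class ContextSanityTest extends ServletTestCase {
     /** Runs on the server side; the proxy fills in the inherited fields */
     public void testContextObjectsArePresent() {
         assertNotNull(request);   // HttpServletRequest wrapper
         assertNotNull(response);  // HttpServletResponse
         assertNotNull(session);   // HttpSession for the test "client"
         assertNotNull(config);    // ServletConfig wrapper
     }
 }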

18.5.2.1. Introduction to Cactus tests: Testing servlets

Let's start by writing tests for a servlet. If you're not already familiar with the servlet programming model and servlets' lifecycle within the web container, you should refer to Chapter 3.

Specifically, we're going to write tests for a servlet that implements the search functionality in the person directory application from Chapter 2. The servlet being tested is PeopleFinderServlet (found in the com.oreilly.jent.people.servlet package in the sample code bundle). It accepts URL arguments that specify the first name, last name, and email search terms, searches the underlying data source for people that match, and generates a web page showing the results. We won't actually list the code for the servlet before writing our tests (which should make extreme programming advocates happy). Instead, we'll write tests according to how we expect the servlet to behave (based on hypothetical functional specifications).

Since we're testing a component that requires a servlet context in order to be tested, we define our TestCase as an extension of ServletTestCase. We'll start with a single test on our TestCase implementation, which is shown in Example 18-7.

Example 18-7. Servlet test example
 // Imports omitted...

 public class TestPeopleFinderServlet extends ServletTestCase {
     /** Common servlet instance, which serves as our test fixture */
     private PeopleFinderServlet mPFServlet = null;

     /** Constructors inherited from ServletTestCase */
     public TestPeopleFinderServlet() {
         super();
     }
     public TestPeopleFinderServlet(String name) {
         super(name);
     }
     public TestPeopleFinderServlet(String name, Test t) {
         super(name, t);
     }

     /** Set up the test fixture. In this case, create the servlet instance */
     protected void setUp() {
         // Create the servlet instance. We also initialize the servlet
         // here, so that we know it's done only once. If any tests
         // require an uninitialized servlet for some reason, the
         // test will have to create its own servlet instance.
         mPFServlet = new PeopleFinderServlet();
         try {
             mPFServlet.init(this.config);
         }
         catch (ServletException se) {
             fail("Unexpected servlet exception while setting up test case: "
                  + se.getMessage());
         }
     }

     /** Invoked on the "client side" (within the test runner) before
      *  asking the proxy to run the validSearch test */
     public void beginValidSearch(WebRequest request) {
         // Add a search parameter
         request.addParameter(PersonDAO.FIRST_NAME, "John");
     }

     /** Test method, run on the server side by the proxy. This test ensures
      *  that a valid response is generated from a valid search request */
     public void testValidSearch() {
         // Simply invoke the servlet's doGet() method, using
         // the request constructed in the begin method
         invokeGet();
     }

     /** Invoked on the "client side" (within the test runner) after the
      *  proxy runs the validSearch test and sends a response to
      *  the test runner */
     public void endValidSearch(WebResponse response) {
         // Since this was a valid search, there should be no "Error"
         // text in the response HTML, and a "Search Results" header
         // should be included
         assertTrue((response.getText().indexOf("Error:") < 0) &&
                    (response.getText().indexOf("Search Results") >= 0));
     }

     /** Tear down the test fixture. Invoke the servlet's destroy method,
      *  to ensure any servlet cleanup occurs */
     protected void tearDown() {
         mPFServlet.destroy();
         mPFServlet = null;
     }

     // Utility methods used by the tests
     /** Invoke the get method on the servlet in our fixture. If any
      *  exceptions are generated, fail the test. */
     private void invokeGet() {
         try {
             mPFServlet.doGet(this.request, this.response);
         }
         catch (IOException ioe) {
             fail("Unexpected IO exception: " + ioe.getMessage());
         }
         catch (ServletException se) {
             fail("Unexpected servlet exception: " + se.getMessage());
         }
     }
 }

In our TestPeopleFinderServlet test case, we have a single test method, testValidSearch( ), that exercises the basic functions of the servlet and tests for an appropriate response. To drive this test, we've written a beginValidSearch( ) method that adds a set of URL arguments to the web request. As depicted earlier in Figure 18-2, the beginValidSearch( ) method is invoked on the "client" side of the test session, prior to making the HTTP request to the Cactus proxy. Our beginValidSearch( ) method has the required method signature as dictated by Cactus: the base of the method name (after the "begin") matches the base of the test method name (after the "test"), it is public, has a void return type, and takes a single WebRequest argument. The WebRequest encapsulates the details of the actual HTTP request that is generated and passed on to the Cactus proxy when it is asked to invoke the test method. In our beginValidSearch( ) method, we add a URL argument whose name is taken from the PersonDAO.FIRST_NAME constant (the name that the servlet uses to look up this argument) and with a value of "John". According to our understanding of the servlet being tested, this should be considered a valid search request.

The testValidSearch( ) method itself simply invokes the servlet being tested, using the invokeGet( ) utility method we've defined on our test case. In the invokeGet( ) method, we use a test fixture consisting of an instance of the PeopleFinderServlet class and simply invoke the servlet's doGet( ) method, passing along the HttpServletRequest and HttpServletResponse objects available in the ServletTestCase member variables. The request will include the URL arguments that we added in our beginValidSearch( ) method, so we just need to get the request processed by our servlet.

This approach to testing a servlet (directly constructing a servlet instance and exercising its methods) may seem a bit odd to those familiar with J2EE component development. Components are always constructed by their corresponding containers, and the component methods are invoked by containers while being managed at runtime. But handling servlets and other components directly is useful and sometimes necessary when writing Cactus tests. If you need to invoke utility methods on your components with specific input arguments, for example, you'll need direct access to a component instance, and you'll need to construct one yourself to make that happen since web containers do not directly expose servlets and JSP components for programmatic access. Another reason to use a direct servlet instance is to allow you to control the initialization of the component. You can adjust the contents of the ServletConfig object passed into the servlet's init( ) method before its request-handling methods are executed.
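For example, the ServletConfig wrapper that Cactus supplies lets a test add initialization parameters before calling init( ). Here's a minimal sketch along those lines, reusing the fixture from Example 18-7; the maxResults parameter is a hypothetical setting, not part of the sample application:

 protected void setUp() {
     mPFServlet = new PeopleFinderServlet();
     try {
         // Add an init parameter via the Cactus ServletConfig wrapper
         // before the servlet's init() method reads its configuration
         // ("maxResults" is a hypothetical parameter name)
         config.setInitParameter("maxResults", "25");
         mPFServlet.init(this.config);
     }
     catch (ServletException se) {
         fail("Unexpected servlet exception while setting up test case: "
              + se.getMessage());
     }
 }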

You can, however, choose to exercise a servlet that has been deployed in the standard way within a web container, as long as it's the same web container in which the Cactus tests are being run. First, you'll need to ensure that the servlet being tested has been deployed in the application containing the Cactus proxy and the Cactus tests being run. This entails the usual servlet deployment details: the web deployment descriptor (web.xml) needs to have a servlet entry and at least one servlet-mapping entry for the servlet being tested. Then, in your Cactus test, during the execution of one of the testXXX( ) methods on the server side of the proxy, you can forward the HTTP request and response pair provided by the proxy to the servlet by using the RequestDispatcher available through the ServletContext. If our PeopleFinderServlet has been deployed and mapped to the relative URL /search-servlet, for example, we can use this approach to invoke the servlet managed by the container:

 try {
     RequestDispatcher dispatcher =
         config.getServletContext().getRequestDispatcher("/search-servlet");
     dispatcher.forward(request, response);
 }
 catch (IOException ioe) {
     fail("Unexpected IO exception: " + ioe.getMessage());
 }
 catch (ServletException se) {
     fail("Unexpected servlet exception: " + se.getMessage());
 }

This approach offers you the advantage of fidelity: the servlet component is being tested while operating in the same runtime environment that it will experience when it is "live." But you can use this approach only for tests that can be executed using an HTTP request to the web component.

Taking a look at the initialization of our Cactus test fixture, we see that, just as with any JUnit test, we initialize our test fixture in the setUp( ) method and clean up the fixture in the tearDown( ) method. In setUp( ), we initialize our servlet instance by constructing an instance and then calling its init( ) method, passing in the ServletConfig object member variable provided to all ServletTestCase instances. The Cactus proxy initializes this and other servlet context variables that our TestPeopleFinderServlet test case inherited from the Cactus ServletTestCase class. We invoke the servlet's init( ) method in our fixture setup routine to ensure that it is properly initialized before exercising its functionality in our tests. In a sense, we're (partially) taking on the role of the servlet container here and going through the expected lifecycle callbacks of the servlet component. In the tearDown( ) method, we invoke the servlet's destroy( ) method and then remove our reference to the instance so it can be garbage-collected.

Finally, in our endValidSearch( ) method, we check the response from the servlet against what we expected, given the request that we made of it. This method is executed on the "client" side of the Cactus proxy, after the test method has been run. Each endXXX( ) method takes a single WebResponse argument, which is a Cactus object that encapsulates the results returned to the "client" from the proxy on the server. In our case, we check to make sure that the returned HTML does not contain the text string Error:, which we know (from our hypothetical functional specifications for the servlet) should be included in any response to an invalid or failed search request. We also check to be sure that the response HTML includes a Search Results banner, as expected in a valid search result.

In this example test, we made use of both the beginXXX( ) and endXXX( ) methods to initialize the request parameters and check the response, respectively. In some cases, one or both of these will not be necessary. For example, our test servlet is supposed to treat a lack of search arguments as an invalid search. We can test for this behavior by adding the following test method to our TestPeopleFinderServlet:

 public void testNoSearchArguments() {
     // Invoke the doGet method on the servlet using the request
     // that was generated through the proxy. This request should
     // include no search arguments since we have not added any in
     // a begin method.
     invokeGet();
 }

 public void endNoSearchArguments(WebResponse response) {
     // The response should contain the key phrase "Invalid search",
     // indicating a bad search request
     assertTrue(response.getText().indexOf("Invalid search") >= 0);
 }

Note that we do not have a beginNoSearchArguments( ) method defined here because we do not need to add anything to the servlet request before running the test. In fact, we explicitly avoid adding any URL arguments to trigger the functionality being tested (a bad search, lacking any search arguments).

In other cases, we may not need either the beginXXX( ) or endXXX( ) method for a particular test. A typical example is tests that exercise utility methods on a component. Our PeopleFinderServlet, for example, has a static utility method named personToHTML( ) that accepts a Person Java bean as an argument and returns an HTML snippet that represents that person's information. We don't need to prepare any request information, or test any response information, to perform this test. We simply need a standard test method that creates a sample Person object, passes it through the utility method, and checks the results:

 /** Test the HTML conversion util on PeopleFinderServlet */
 public void testPeopleToHTML() {
     // Create a sample person
     String fname = "Andy";
     String lname = "Long";
     String eAddr = "andy.long@buffaloimports.org";
     Person samplePerson = new Person();
     samplePerson.setFirstName(fname);
     samplePerson.setLastName(lname);
     samplePerson.addEmailAddress(eAddr);
     // Create the expected HTML output from the converter
     StringBuffer sampleHTML = new StringBuffer();
     sampleHTML.append("<tr bgcolor=\"#EEEEEE\"><td>")
         .append(lname).append("</td><td>").append(fname)
         .append("</td><td>").append(eAddr).append("</td></tr>");
     // Now apply the method to our Person and validate the results
     // (expected value first, per the assertEquals convention)
     String result = PeopleFinderServlet.personToHTML(samplePerson);
     assertEquals(sampleHTML.toString(), result);
 }

In this specific test, we don't really need the Cactus framework to allow us to exercise the component within a servlet container. We could run this test with a standard JUnit TestCase since we're not making use of any of the servlet context parameters. But since you typically want to test functionality that requires a servlet context (that includes, for example, request and response data), you may find yourself mixing simple tests like this with more complicated tests that require the full proxy framework of Cactus.

18.5.2.2. Testing JSPs

If you're not already familiar with programming JSPs and their lifecycle within the web container, you should refer to Chapter 4.

Testing JSPs using Cactus is very similar to testing servlets. The same proxy scheme is used to set up context information, such as request parameters and cookies that are needed by the JSP to function properly. The JSP under test is invoked by forwarding the HTTP request to the page from within your test case, similar to our servlet test example in the previous section. To support JSP-specific context parameters, such as the various scopes of page variables (page, request, and session), Cactus provides a JspTestCase that can be used as the base class for JSP-related unit tests. The proxy that corresponds to this type of test operates as a JSP page and provides wrapped JSP context parameters (specifically, the JspWriter and PageContext) to the test, in addition to the standard servlet context parameters.

The test shown in Example 18-8 demonstrates a typical JspTestCase. In this case, the TestSearchJSP test case is testing the proper behavior of the search.jsp page from our PeopleFinder example application. The test itself is almost identical to the valid search test in our TestPeopleFinderServlet from Example 18-7. This isn't surprising since both components are web components and they both function identically in this regard (i.e., they expect the same inputs and generate roughly the same output).

Example 18-8. Sample JSP test case
 // Imports omitted...

 public class TestSearchJSP extends JspTestCase {
     // Constructors omitted...

     /** Set up the request on the "client" side of the proxy */
     public void beginValidSearch(WebRequest request) {
         // Add a search parameter to the request
         request.addParameter(PersonDAO.FIRST_NAME, "John");
     }

     /** Pass the valid search request to the page */
     public void testValidSearch() {
         // Invoke the target page by forwarding the request
         invokePage();
     }

     /** Check for the appropriate response on the "client" side of the proxy */
     public void endValidSearch(WebResponse response) {
         // Since this was a valid search, there should be no "Error"
         // text in the response HTML
         assertTrue((response.getText().indexOf("Error:") < 0) &&
                    (response.getText().indexOf("Search Results") >= 0));
     }

     /** Utility method that forwards the request to the target page */
     private void invokePage() {
         try {
             // Forward the request to the search JSP
             this.pageContext.forward("/search");
         }
         catch (ServletException se) {
             fail("Unexpected servlet exception: " + se.getMessage());
         }
         catch (IOException ioe) {
             fail("Unexpected I/O exception: " + ioe.getMessage());
         }
     }
 }

One noticeable difference is in how the JSP is invoked in our invokePage( ) utility method. Here, we're using the JSP PageContext object to forward the request from the Cactus proxy to the target JSP, using the mapped URL /search where we expect the JSP to be deployed. In our TestPeopleFinderServlet test case, we showed two ways we could pass a request to a target servlet. We could invoke the forward( ) method on the RequestDispatcher retrieved from the ServletContext, or we could manually construct a servlet instance, initialize it, and invoke its doGet( ) or doPost( ) method. The first approach (using RequestDispatcher.forward( )) is roughly equivalent to the PageContext.forward( ) approach, and we could do the same thing here in our JspTestCase. When testing a JSP, however, it's not practical to manually "construct" an instance of the target JSP and invoke it programmatically: the web container is responsible for parsing the JSP, converting it into a servlet instance, and routing web requests to the resulting web component. So the only viable approach for testing JSPs is to forward the test request directly to the deployed JSP through the web container.

If your tests require access to only the standard servlet context parameters (request, response, session data, and the ServletConfig), you can use a ServletTestCase to test your JSPs just as well as a JspTestCase. In fact, JspTestCase uses ServletTestCase as its base class. But if your tests need to modify the variables in the various scopes available to JSPs, you'll need access to the JSP PageContext, which is available only to JspTestCase instances.
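For example, a test that wants the page to see a preexisting session-scoped variable could seed it through the PageContext before forwarding the request. A sketch, added to the TestSearchJSP class from Example 18-8 (the defaultSearchField attribute name is hypothetical):

 public void testSearchWithSessionDefault() {
     // Seed a session-scoped variable that the page is assumed to read
     // ("defaultSearchField" is a hypothetical attribute name)
     this.pageContext.setAttribute("defaultSearchField",
                                   PersonDAO.FIRST_NAME,
                                   PageContext.SESSION_SCOPE);
     // Forward to the page as in Example 18-8
     invokePage();
 }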

18.5.2.3. Testing JSP custom tags

If you're not familiar with programming JSP custom tags and their lifecycle within the web container, you should refer to Chapter 4.

JSP custom tags are a common part of J2EE web application architectures since they allow you to cleanly inject functionality into JSPs in the form of page tags that invoke Java code.

Several approaches to testing custom tags using Cactus are available. You can use a JspTestCase, such as those shown in the previous section, to exercise JSPs that contain the tag under test. In your endXXX( ) methods, you can test the response for specific content that the tag is expected to generate. The JSPs that you use can be either actual application pages or JSPs created specifically to test the target tags. Using specially constructed JSPs has its advantages: you can simplify the JSP to include only the elements necessary to test the tag, and you isolate your tests from the structure of your application's design.

You can also test a custom tag more directly, but in a slightly more artificial fashion, by programmatically constructing a tag instance within a JspTestCase. Continuing with our progressive PeopleFinder example, the search page uses the SearchTag custom JSP tag to actually perform the search. We can directly test our custom tag using a JspTestCase, as shown in the TestSearchTag test case in Example 18-9.

Example 18-9. Test case for a custom JSP tag
 // Imports omitted...

 public class TestSearchTag extends JspTestCase {
     /** An instance of the SearchTag being tested */
     private SearchTag mTag = null;

     // Constructors omitted...

     /** Initialize our test fixture */
     public void setUp() {
         // Make a tag instance
         mTag = new SearchTag();
         // Set its JSP context
         mTag.setPageContext(this.pageContext);
     }

     /** Clean up our test fixture */
     public void tearDown() {
         // Call the context popBody() method, which will ensure that
         // it emits its generated content from the tag, if any
         this.pageContext.popBody();
         // Release the tag reference
         mTag = null;
     }

     // doStartTag() can throw the checked JspException, so the test
     // method declares it; JUnit reports any such exception as an error
     public void testValidSearch() throws JspException {
         // Set the name of the page variable where the results are to be
         // stored
         mTag.setVarName("people");
         // Set the first name search attribute on the tag
         mTag.setFirstNamePattern("John");
         // Invoke the tag's doStartTag(), performing the search
         mTag.doStartTag();
         // If the tag performed as expected, there should be no
         // error message in "people-error", and a non-null collection
         // in the "people" page variable
         assertNull(this.pageContext.findAttribute("people-error"));
         assertNotNull(this.pageContext.findAttribute("people"));
     }
 }

In our test, we're using a test fixture that includes an instance of a custom tag, stored in the member variable mTag. We set up the fixture in our setUp( ) method by constructing an instance of the tag and setting its JSP PageContext to the context provided to us by the JSP Cactus proxy. Again, this emulates (partially) the management of the tag performed by a JSP engine within a web container. In our tearDown( ) method, we clean up our fixture by invoking the popBody( ) method on the PageContext. This ensures that the content generated by our tag, if any, is flushed into the response, so that it can be checked for validity. In our case, the SearchTag doesn't generate any content, so the call to popBody( ) isn't necessary, but we include it here for completeness.

Our tag, SearchTag, performs a search against a set of person data using the search attributes defined on the tag. It stores the results as a collection of Person objects in the page variable specified by the varName attribute on the tag. If an error occurs, the tag generates an error string and stores it in a page variable named <varName>-error, where <varName> is the value of the varName attribute. Our testValidSearch( ) test method tests the behavior of the tag when it's given a valid search request. We set the value of the varName attribute, and then we set the firstNamePattern attribute to "John". With the attributes set, we invoke the tag's doStartTag( ) method to exercise the tag's search functions. After the doStartTag( ) method returns, we test the results directly by checking the page variables through the PageContext. Since we're strictly testing the functionality of the tag and not any surrounding JSP, there's no need to set request arguments in a beginValidSearch( ) method. And since the results we need to test are JSP page variables that are available immediately after the tag is finished processing, we can check the results directly in our test method and have no need to check the HTTP response in the endValidSearch( ) method.

18.5.2.4. Testing servlet filters

If you're not already familiar with servlet filters and their role in the management of servlets and JSPs, you should refer to Chapter 3.

Tests for servlet filters require their own Cactus proxy implementation and Cactus TestCase base class since filter tests need access to the filter context parameters, just as servlet tests and JSP tests need their respective context parameters. Filter tests extend the FilterTestCase base class, which provides the tests with wrappers for the FilterConfig and FilterChain objects associated with the filter request.

We're going to write a test for a LoginFilter that is used to authenticate users as they request web resources. The code for the filter isn't shown here (again, in part to save space, in part to cater to orthodox extreme programming advocates). But the expected functionality of the filter is fairly simple. The filter maintains the authenticated ID of the user in a session variable whose name is given by the value of the LoginFilter.AUTHN_ID_VAR constant. If a web request is picked up by the filter and the corresponding session does not have this variable, the filter checks for two request parameters, LoginFilter.USER_VAR and LoginFilter.PASSWORD_VAR, and compares these against a set of username/password data in an attempt to authenticate the user. If either of these checks succeeds (the session variable is present or the login parameters check out), the filter passes the request down the filter chain, allowing the user access to the target page or web component. If the session variable isn't present and the login parameters are either missing from the request or don't match any known user, it's assumed the user still needs to log in, and he's redirected to a login page whose URL is given by the filter's loginURI property. This login page is expected to prompt the user for a username and password and redirect him back to the original resource that was requested.

Based on this description, the TestLoginFilter test in Example 18-10 partially tests the functionality of the filter when a user makes a request without any login credentials. The test case uses an instance of the LoginFilter as its test fixture, constructing and initializing the filter in its setUp( ) method and destroying it in its tearDown( ) method. While initializing the filter, we invoke its init( ) method, passing in the wrapped FilterConfig inherited from FilterTestCase and initialized by the Cactus proxy.

Example 18-10. Test case for a servlet filter
 // Imports omitted...

 public class TestLoginFilter extends FilterTestCase {
     /** An instance of the filter under test */
     private LoginFilter mFilter = null;

     // Constructors omitted...

     /** Set up for tests */
     public void setUp() {
         // Construct and initialize the filter instance
         mFilter = new LoginFilter();
         try {
             mFilter.init(this.config);
         }
         catch (ServletException se) {
             fail("Unexpected servlet exception: " + se.getMessage());
         }
     }

     public void tearDown() {
         mFilter.destroy();
         mFilter = null;
     }

     /** Ensure that the filter performs a redirect when no authentication
      *  credentials are present in the user session, and no login username and
      *  password are provided as request parameters */
     public void testLoginRedirect() {
         // Ensure that the user session is clear of our authentication
         // credentials
         HttpSession session = this.request.getSession();
         if (session != null) {
             session.removeAttribute(LoginFilter.AUTHN_ID_VAR);
         }
         try {
             // Invoke the filter
             mFilter.doFilter(this.request, this.response, this.filterChain);
         }
         catch (IOException ioe) {
             fail("Unexpected I/O exception while invoking filter: "
                  + ioe.getMessage());
         }
         catch (ServletException se) {
             fail("Unexpected servlet exception while invoking filter: "
                  + se.getMessage());
         }
     }

     public void endLoginRedirect(WebResponse response) {
         // The response should contain an HTTP "MOVED TEMPORARILY"
         // status with a redirect (expected value first)
         assertEquals("Unexpected status code seen in response",
                      HttpServletResponse.SC_MOVED_TEMPORARILY,
                      response.getStatusCode());
     }
 }

In our single test, testLoginRedirect( ), we ensure that the filter is given a clean request by clearing the session of the authentication variable, if it's present from a previous user request (e.g., another earlier test method may have set the session variable). Then we pass the request and response objects to the filter's doFilter( ) method, along with the wrapped FilterChain inherited from FilterTestCase. If the filter operates as expected, this should generate a redirect status in the response, which we check in the endLoginRedirect( ) method. Note that we do not need to define a beginLoginRedirect( ) method in this case because there are no request parameters to initialize. In fact, for the purposes of this test, we need the request to have none of the login request parameters in order to force the redirect we're testing.

In our example test, we're programmatically creating an instance of the filter being tested and exercising its methods directly. Just as with servlets and JSPs, we also have the option of testing a live filter that has been deployed in the web container. The advantages and disadvantages of the two approaches are similar to the servlet case, and there are times when one or the other approach will be needed, depending on the test. Using a direct filter instance allows you to exercise any methods on the filter, including utility methods that are not directly called by the web container. Exercising a live filter, on the other hand, gives you more fidelity in terms of seeing the true behavior of the component in its natural context. But there are some complications to consider, such as configuring the filter so that it can be tested cleanly, without interfering with the testing of other components in the application. In our PeopleFinder application, for example, our LoginFilter would normally be mapped to "cover" the entire application to ensure that only authorized users are using the functions in the system. But for the purposes of testing, this would complicate our testing of the web components (servlets and JSPs). With the LoginFilter in place, we'd have to ensure that all of our servlet and JSP tests attach valid authentication credentials to the test requests by initializing the session variable checked by the LoginFilter. This complicates our other tests, forcing them to include details about another part of the system. So for testing purposes, we would likely set up a special test configuration for the application, where the various web components being tested are deployed without the LoginFilter protecting them.

Another point to consider about servlet filters and in-container testing: take care that your application's servlet filters are not applied to the Cactus test proxies. If, for example, you define a filter mapping in your web.xml file that covers all URLs in the application, like so:

 <filter-mapping>
     <filter-name>LoginFilter</filter-name>
     <url-pattern>/*</url-pattern>
 </filter-mapping>

then you will also be inserting this filter in front of the Cactus test proxies that are added to your application when it's deployed for testing. Your filters may interfere with the proper functioning of the Cactus proxies, so you should be careful to control your mappings so application filters are applied only to your application components. As discussed in "Cactus test proxies" earlier in this chapter, you can control the way Cactus proxies are mapped to URLs in your application, so you should be able to find a way to manage the various mappings appropriately.
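The simplest fix is usually to scope your filter mappings to your application's own URLs. For instance, assuming the application's pages are served under /app:

 <filter-mapping>
     <filter-name>LoginFilter</filter-name>
     <!-- Covers application URLs only, leaving the Cactus proxies
          (/ServletRedirector and friends) unfiltered -->
     <url-pattern>/app/*</url-pattern>
 </filter-mapping>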

18.5.2.5. Testing EJBs and other component types

We've seen how various J2EE web components (servlets, filters, JSPs, JSP custom tags) can be tested using the Cactus framework. Cactus can also be useful for testing other J2EE components since it provides a way to run a test from inside a container environment. This ability to test from within a container can be critical in many situations. There may be functionality on the components that can be exercised only from within a container, for example. The most extreme example of this is EJBs that expose only a local client interface. It's not possible to test such a component from a standalone JVM running standard JUnit tests because a client in a separate JVM can reach an EJB only through a remote interface. You could create remote interfaces for the purposes of testing, but then you wouldn't really be testing the component as it will be used in the production environment. You'd also be adding new, unnecessary code to your system that is there simply to support testing.

When writing a Cactus test for other J2EE components and for general server-side utility classes, you'll need to decide which Cactus base class to use for your test. In most cases, the choice isn't critical. Generally, when testing an EJB component, for example, you can extend your test from ServletTestCase, FilterTestCase, or JspTestCase equally well. Any of these will allow your tests to be driven by a server-side Cactus proxy. If your components or other server code need to be driven from a particular type of web component, though, you'll need to use the corresponding type of test case as the base classes for your tests so that you can properly initialize and exercise the component. A real-world example of this is an authorization service that is invoked from within a servlet filter in the system. The authorization service could be designed to check the user credentials and, based on their assigned roles and permissions, route the user to different areas in the application based on settings provided in the filter's initialization parameters. Any tests you wrote for this service would be best written as an extension of FilterTestCase so that all the necessary context parameters are available to pass to the service.

As a demonstration of testing other J2EE code with Cactus, Example 18-11 shows TestPeopleFinderEJB, which tests the EJB from the PeopleFinder sample application.

Example 18-11. Sample Cactus test for an EJB component
 // Imports omitted...

 public class TestPeopleFinderEJB extends ServletTestCase {
     private PeopleFinder mFinderBean = null;
     private static Logger sLog =
         Logger.getLogger(TestPeopleFinderEJB.class.getName());

     // Constructors omitted...

     /** Create an instance of the EJB component under test */
     protected void setUp() {
         // Initialize an EJB reference
         String beanName = "java:comp/env/ejb/PeopleFinder";
         try {
             InitialContext ctx = new InitialContext();
             PeopleFinderHome home =
                 (PeopleFinderHome)PortableRemoteObject.narrow(
                     ctx.lookup(beanName), PeopleFinderHome.class);
             this.mFinderBean = home.create();
         }
         catch (NamingException ne) {
             fail("Unable to lookup EJB component using name '" +
                  beanName + "': " + ne.getMessage());
         }
         catch (CreateException ce) {
             fail("Unable to create PeopleFinder bean: " + ce.getMessage());
         }
         // Remote home methods can also fail with a RemoteException
         catch (RemoteException re) {
             fail("Remote error creating PeopleFinder bean: " + re.getMessage());
         }
     }

     /** Clean up our EJB instance */
     protected void tearDown() {
         try {
             mFinderBean.remove();
             mFinderBean = null;
         }
         catch (RemoveException re) {
             fail("Removal of EJB in fixture failed: " + re.getMessage());
         }
         catch (RemoteException re) {
             fail("Remote error removing EJB in fixture: " + re.getMessage());
         }
     }

     // Verify that the EJB correctly rejects a query with no arguments
     public void testNoSearchArguments() {
         // Try to pass in an empty parameter set
         try {
             mFinderBean.findPeople(new SearchArg[0]);
             fail("PeopleFinder allowed an empty search");
         }
         catch (InvalidSearchException ise) {
             sLog.info("PeopleFinder correctly rejected an empty search");
         }
         catch (PersistenceException pe) {
             fail("Unexpected persistence exception: " + pe.getMessage());
         }
         catch (RemoteException re) {
             fail("Unexpected remote exception: " + re.getMessage());
         }
     }
 }

Our test case extends ServletTestCase, not for any particular reason, but simply to allow our test to be driven by a server-side Cactus proxy. We could just as easily have extended our test from JspTestCase or FilterTestCase.

Our test maintains a handle on a PeopleFinder EJB component in its fixture. The setUp( ) method looks up the EJB home interface through the server's JNDI context and then creates an EJB reference. Our tearDown( ) method invokes the remove( ) method on the reference and then sets the fixture variable to null. Our only test method, testNoSearchArguments( ), verifies that the EJB correctly rejects a search request that has no search parameters included. We have no beginNoSearchArguments( ) and endNoSearchArguments( ) methods because the functionality we're testing doesn't require any particular web context information like request parameters or session data.

18.5.2.6. Testing asynchronous code

Even with the Cactus proxy architecture and its ability to test J2EE code running in a container environment, some distributed and asynchronous architectures are difficult to test programmatically using a unit test framework. Typical examples in the J2EE realm are JMS message handlers, message-driven EJBs, and message-oriented SOAP services implemented using JAX-RPC or SAAJ. In these situations, a message or request of some kind is generated and sent to the target handler. The handler processes the message or request and generates a response or some side effects. In a typical test scenario, you'll want to generate specific inputs and verify the results of the handler's processing of those inputs. This is problematic because of the asynchronous and distributed nature of the handlers. A JMS message handler, for example, will take an indeterminate amount of time to be notified of the message by the JMS service. Once it has been notified, it may generate side effects on the server or generate a response message. If the only result generated by the JMS message handler is a response message, we can simply poll for the response message within the Cactus test and verify its structure:

 . . .
 public void testJMSResponse() {
     QueueReceiver recvr = . . .;
     QueueSender sender = . . .;
     Message testMsg = . . .;
     // Send the test message
     sender.send(testMsg);
     // Poll for the expected response
     Message response = recvr.receive();
     // Validate response
     . . .
 }
 . . .

But even this simple case is complicated a bit by the fact that the handler may fail to generate a message, meaning that we need to give up waiting for a message after some period of time and then fail the test. But how long do we wait? Is 30 seconds too long? Two minutes? Sometimes these questions can be answered by using maximum delay thresholds that have been defined as requirements on the system. By definition, if a response is not received in this time, the handler has failed. We're mixing functional testing and performance testing in this situation, but at least we can define our test in a concrete way and validate the results.
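In code, such a threshold maps directly onto the timed form of receive( ). A minimal sketch, assuming a 30-second requirement:

 // Wait up to 30 seconds for the handler's response;
 // receive(timeout) returns null if the timeout expires
 Message response = recvr.receive(30000);
 assertNotNull("No response within the 30-second threshold", response);
 // Validate response contents...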

If the handler generates side effects only on its side of the asynchronous interaction, your test must have direct access to the side effects and be able to verify them. In some cases, this is possible; for example, the JMS handler may make modifications to data stored in an RDBMS that we can check from the test code. But there's still the issue of knowing when to check the results. How do we know when the handler has received the message and finished processing it?

There are a number of approaches to dealing with this. The right answer for you depends on how and what you need to test with regard to your asynchronous, distributed subsystems. A couple of possible approaches are:


Use mock objects.

In the JMS case, for example, rather than creating a real JMS message and delivering it to your handler through a real JMS service, you can create a mock Message and invoke the onMessage( ) callback on the handler directly (see the sketch following this list). This unit tests the handler code, which is usually the goal. And removing the JMS service from the picture converts the test scenario from an asynchronous situation to a synchronous one: when the onMessage( ) method completes, the handler is finished and all of its side effects should be committed.


Use white box testing instead of black box testing.

Rather than testing the functioning of the handler as a black box, pull apart the handler logic and test its individual elements as a set of white box tests. If a SOAP service operation, for example, invokes objects A, B, and C to respond to a particular request, write a series of unit tests that exercises these same server-side objects in the same way the SOAP service does. This also eliminates the asynchronous communications from the mix and tests equivalent functionality, but it couples your test suite to implementation details: whenever the handler's implementation changes, the white box tests must be updated to remain consistent with it.
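Here is a minimal sketch of the mock objects approach. The handler class PeopleFinderHandler and the message payload are assumptions, and a dynamic proxy stands in for a real TextMessage so that the example remains self-contained:

 import java.lang.reflect.InvocationHandler;
 import java.lang.reflect.Method;
 import java.lang.reflect.Proxy;
 import javax.jms.TextMessage;
 import junit.framework.TestCase;

 public class TestPeopleFinderHandler extends TestCase {
     public void testHandlerWithMockMessage( ) throws Exception {
         // Build a mock TextMessage that supplies only the payload
         // the handler actually reads (the payload here is assumed)
         TextMessage mockMsg = (TextMessage) Proxy.newProxyInstance(
             TextMessage.class.getClassLoader( ),
             new Class[] { TextMessage.class },
             new InvocationHandler( ) {
                 public Object invoke(Object proxy, Method m, Object[] args) {
                     return "getText".equals(m.getName( )) ? "<search/>" : null;
                 }
             });
         PeopleFinderHandler handler = new PeopleFinderHandler( ); // assumed class
         // Invoke the callback directly: no JMS service, no asynchrony
         handler.onMessage(mockMsg);
         // When onMessage( ) returns, all side effects are committed,
         // so they can be verified synchronously here
     }
 }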

18.5.3. Running Cactus Tests

Cactus tests are run using a JUnit TestRunner, just like standard JUnit tests. The difference is that the TestRunner needs to be run in such a way that it "knows" where to find the Cactus server-side proxies so that the Cactus tests can pass their test requests along to be run in the server container(s).

Cactus uses a set of Java properties to define the network location of its proxies. These properties are set in a cactus.properties file that's read from the classpath used to run the TestRunner, or they can be specified using command-line arguments to the JVM. The properties that Cactus supports are shown in Table 18-1.
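For example, the required context URL can be supplied as a JVM system property instead of through a properties file (this sketch reuses the test class and context URL from the examples later in this section):

 > java -Dcactus.contextURL=http://localhost:8080/jent-peopleFinder-test \
        junit.textui.TestRunner com.oreilly.jent.people.servlet.TestPeopleFinderServlet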

Table 18-1. Cactus runtime properties

cactus.contextURL
    The base URL for the various Cactus web proxies. This property is required.

cactus.servletRedirectorName
    The name to which the Cactus servlet proxy is mapped in the test application's web.xml deployment descriptor. The default value is "ServletRedirector", which is the name used by default by the cactifywar Ant task when it adds the Cactus proxies to your web.xml file.

cactus.jspRedirectorName
    The name to which the Cactus JSP proxy is mapped in web.xml. The default value is "JspRedirector".

cactus.filterRedirectorName
    The name to which the Cactus servlet filter proxy is mapped in web.xml. The default value is "FilterRedirector".


When using a standalone JUnit TestRunner (such as the text-based TestRunner we used in our JUnit examples or a TestRunner integrated with an IDE like Eclipse), we can set these properties in a cactus.properties file and ensure that this properties file is in the classpath when we invoke the TestRunner. The cactus.contextURL property is the only one that is strictly required; the others are needed only if you used custom mappings for the Cactus proxies when you added them to your test application. So a minimal cactus.properties file would be:

 # Set the base URL for the Cactus proxies
 cactus.contextURL = http://localhost:8080/jent-peopleFinder-test

If any Cactus tests are run with this properties file, the ServletTestCase tests will attempt to connect to the servlet proxy at the URL http://localhost:8080/jent-peopleFinder-test/ServletRedirector, the JspTestCase tests will talk to the JSP proxy at http://localhost:8080/jent-peopleFinder-test/JspRedirector, and so on.

With this properties file in our classpath somewhere, we can invoke Cactus tests from the command line just as we did with regular JUnit tests:

 > java junit.textui.TestRunner com.oreilly.jent.people.servlet.TestPeopleFinderServlet
 ...
 Time: 5.763

 OK (3 tests)

Of course, we still need to ensure that the application server running our test application (and the Cactus proxies) is up and running before we do this.

This same approach can be used to run Cactus tests using JUnit facilities built into IDEs like Eclipse. Simply configure the JUnit tool to include the proper cactus.properties file in its runtime classpath, and it will be able to drive your server-side Cactus tests along with standard JUnit tests.

18.5.3.1. Running Cactus tests with Ant

We can also use the standard junit Ant task to drive our Cactus tests if we want. Just as with the other TestRunner examples, we need to make sure that the appropriate cactus.properties file is in the classpath of the junit task when we invoke it:

 <target name="run-server-tests-junit">     <junit printsummary="on"            showoutput="true">         <classpath>             <!-- Add the application classes and the test classes -->             <path ref/>             <path location="${java.test.classes.dir}"/>             <!-- Add the directory containing the cactus.properties -->             <path location="${config.dir}"/>         </classpath>         <test name="com.oreilly.jent.people.servlet.TestPeopleFinderServlet"/>     </junit> </target> 

Notice that we're including all of the application classes (specified elsewhere as the value of the java.test.compile.classpath path), and the test classes themselves (specified using the java.test.classes.dir directory, where the test classes are compiled). Again, before invoking the junit task, we'll need to ensure that the test application server is up and running. If we have Ant targets defined to start the test server, we could add those as dependencies on this testing target.
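For example, assuming a start-test-server target is defined elsewhere in the build file (the target name is an assumption), the target declaration above would become:

 <!-- "start-test-server" is an assumed target that boots the test server -->
 <target name="run-server-tests-junit" depends="start-test-server">
     <!-- junit task invocation as shown above -->
 </target>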

As an alternative to driving server-side Cactus tests using Ant, Cactus includes the cactus Ant task. This task essentially plays the same role as the junit task (in fact, cactus is implemented as an extension of the junit task). The difference is that cactus also starts up the test application server before running the Cactus tests you specify and shuts down the application server after it runs the tests. This task is especially useful when you're using one or more dedicated application servers for testing, and when the application servers are running locally, allowing the cactus task to start and stop them.

Of course, in order to start and stop the application server running the test application, the cactus task needs to be told about the server configuration (such as what type of server it is and where it is installed). The cactus task supports several popular server types, including JBoss, WebLogic, and Tomcat. It does this with specific subelements that you use to specify the server parameters. As an example, the following target uses the cactus task to drive a set of tests run in a local JBoss server:

 <target name="run-server-tests-cactus">     <cactus earfile="${basedir}/${ant.project.name}-test.ear"             printsummary="yes">         <classpath>             <path ref/>             <pathelement location="${java.test.classes.dir}"/>         </classpath>         <containerset>             <jboss3x dir="/tools/servers/jboss-3.2.5"                      config="default"                      port="8080"/>         </containerset>         <test name="com.oreilly.jent.people.AllServerTests"/>     </cactus> </target> 

In the main <cactus> element, we specify the test application archive file to be deployed for testing using the earfile attribute. The cactus task also supports the other attributes used with the junit task to toggle summary output, forking options, and the like. Within the cactus task, we specify the classpath needed to run the tests. The cactus task needs this information in order to run the specified tests in an internal TestRunner. Notice that we do not have to provide a cactus.properties file in the classpath when using the cactus task. We are specifying the test application archive to run, and the cactus task can extract the mappings for the Cactus proxies from this. The base URL for the proxies will be constructed from the server parameters given in the server-specific child elements.

Next, we specify one or more application servers to be used to run the test application and the tests, using the <containerset> child element. Each server used to run the tests is specified as a child element of <containerset>. In this example, we've specified a single JBoss 3.x server, installed in the /tools/servers/jboss-3.2.5 directory. We're using the default server from this JBoss installation, running on port 8080. If you are using a server not directly supported by the cactus task with its own child element, you can use the <generic> element within the <containerset>. This element lets you specify other Ant targets that should be used to start and stop the application server you're using as well as the HTTP port for the server's web container. You can then implement the start and stop targets in Ant to suit the particular application server in use.

18.5.3.2. Running Cactus tests using ServletTestRunner

Possibly the simplest way to run your Cactus tests is to use the ServletTestRunner provided with Cactus. This TestRunner is actually a servlet that you can invoke from any web browser. You deploy the ServletTestRunner servlet with your test application, mapping it to a particular URL in your application's web context by adding entries like the following to your test application's web.xml file:

 <servlet>
     <servlet-name>ServletTestRunner</servlet-name>
     <servlet-class>
         org.apache.cactus.server.runner.ServletTestRunner
     </servlet-class>
 </servlet>

 . . .

 <servlet-mapping>
     <servlet-name>ServletTestRunner</servlet-name>
     <url-pattern>/ServletTestRunner</url-pattern>
 </servlet-mapping>

This servlet requires the Cactus libraries in order to run, but these will already be included in your test application to support your Cactus tests.

Once the ServletTestRunner is deployed, you can invoke specific TestCase or TestSuite classes using the suite URL argument to the servlet. Assuming that our server is running locally on port 8080, that the context root for the test application is jent-peopleFinder-test, and that the mapping shown here is used for the ServletTestRunner, then to run our TestPeopleFinderServlet test case, we'd go to our browser and navigate to the URL:

http://localhost:8080/jent-peopleFinder-test/ServletTestRunner?suite=com.oreilly.jent.people.servlet.TestPeopleFinderServlet

If all goes well, the results will be returned to the browser as an XML file, as shown in Figure 18-3.

Figure 18-3. Results of TestPeopleFinderServlet


This output format is pretty raw, useful only in browsers that render XML directly (like Mozilla Firefox or Internet Explorer) or when you want to take the XML output and postprocess it yourself to generate a report. If you'd prefer to have the results provided in a different output format, the ServletTestRunner allows you to specify an XSL transform file to process the output of the test run. Just include the XSL transform file in your test application's web archive and tell the ServletTestRunner where to find it by setting a servlet init parameter named xsl-stylesheet. Cactus provides a default stylesheet called cactus-report.xsl that transforms the XML output into HTML. If we put this file in the styles directory of our web archive, we could optionally apply it to the test output by adding the <init-param> element to the servlet web.xml entry as shown here:

 <servlet>
     <servlet-name>ServletTestRunner</servlet-name>
     <servlet-class>
         org.apache.cactus.server.runner.ServletTestRunner
     </servlet-class>
     <init-param>
         <param-name>xsl-stylesheet</param-name>
         <param-value>styles/cactus-report.xsl</param-value>
     </init-param>
 </servlet>

We can then choose to apply this transform to the test output by adding a "transform" argument (with any value) to the URL used to invoke the ServletTestRunner, as in the example below. The result will look like Figure 18-4.
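For instance, to run the same test case with the transform applied (the value "yes" is arbitrary, since any value triggers the transform):

http://localhost:8080/jent-peopleFinder-test/ServletTestRunner?suite=com.oreilly.jent.people.servlet.TestPeopleFinderServlet&transform=yes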

Figure 18-4. Transforming output from TestPeopleFinderServlet with XSL



