Generating Unit Tests

Building a comprehensive regression-testing suite requires writing a lot of test cases, and full-bore test-driven development calls for a unit test for every class. At this scale, we need an approach that prevents the creation of test cases from becoming a chore for the developer. Given the boilerplate nature of unit tests, they are ideal candidates for code generation techniques.

Generating Unit Tests with Eclipse

In addition to running unit tests, Eclipse makes it easy to generate a unit test for any given class. A wizard lays down the code, inspecting the methods of the class under test to determine which test methods should be included in the test case.

The Eclipse JUnit test wizard is shown in Figure 14-2.

Figure 14-2. Eclipse JUnit test case wizard.

The wizard offers several options for configuring the generated JUnit test. It is possible to specify which methods from TestCase should be overridden, define the class under test, and nominate the package for the test. When defining the package, it is considered best practice to use the same package as the class under test. With this approach, the relationship between test and implementation is unambiguous.

Note the warning from Eclipse at the top of the wizard dialog about generating a test case for an interface. Ideally, we want a test case for each class. An interface can be implemented by any number of classes. Nevertheless, this approach does make sense if we are testing a session bean via its remote interface from outside of the EJB container. Methods of the bean itself, such as setSessionContext(), cannot be called by the test case, only by the container. It is therefore nonsensical to generate a test case based on the class, because we cannot test these container-only methods.
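To make the idea concrete, the sketch below exercises a service purely through its business interface, just as a container-external test is limited to a session bean's remote interface. The `CustomerService` interface and its in-memory stub are hypothetical stand-ins invented for illustration; a real test would obtain the reference via JNDI, as the AndroMDA template later in this chapter demonstrates.

```java
// Hypothetical business interface standing in for a session bean's
// remote interface; the names are illustrative, not from the book.
interface CustomerService {
    String greetCustomer(String name);
}

// In-memory stand-in for the bean implementation. In a real test, the
// reference would be obtained through JNDI from outside the container.
class CustomerServiceStub implements CustomerService {
    public String greetCustomer(String name) {
        return "Hello, " + name;
    }
}

public class CustomerServiceTest {
    // The test only sees the interface, so container-managed methods
    // such as setSessionContext() are out of reach by construction.
    static String runGreetingTest() {
        CustomerService service = new CustomerServiceStub();
        return service.greetCustomer("Alice");
    }

    public static void main(String[] args) {
        if (!"Hello, Alice".equals(runGreetingTest())) {
            throw new AssertionError("greetCustomer failed");
        }
    }
}
```

Because the test is written against the interface alone, it remains valid for any implementation the container (or a test stub) supplies.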


It is a good idea to place all test cases in the same package as the class under test but in their own source directory. This makes packaging easier and avoids inadvertently deploying test cases in J2EE modules.

This tactic also provides the testing class with access to the package-level members of the class under test.
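A minimal sketch of this benefit follows; the class names are hypothetical. Because the test class sits in the same package as the class under test, it can read a package-private field directly, which a test in a different package could not.

```java
// Both classes live in the same package (the default package here, for
// brevity); in a real project they would share a package name but sit
// in separate source directories.

// Class under test with a package-private member.
class OrderCounter {
    int count; // package-private: visible only within the package

    void increment() {
        count++;
    }
}

// The test class, being in the same package, can inspect the
// package-private state directly.
public class OrderCounterTest {
    static int runIncrementTest() {
        OrderCounter counter = new OrderCounter();
        counter.increment();
        counter.increment();
        return counter.count; // legal: same package
    }

    public static void main(String[] args) {
        if (runIncrementTest() != 2) {
            throw new AssertionError("increment failed");
        }
    }
}
```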

The final dialog for the wizard is shown in Figure 14-3. This screen allows the exposed methods on the class or interface to be selected for inclusion in the test case. Based on the methods selected, Eclipse creates the method stubs for writing the test, leaving the job of providing the test detail to the developer.

Figure 14-3. Eclipse test method selection dialog.

Generating tests in this manner may seem contrary to the principles of test-driven development. Previously, it was explained that a true test-driven approach to development involves writing the test case ahead of the class under test. How, then, can the Eclipse code generator be used for laying down the test case based on the class under test, when according to the principles of test-driven development, no actual class should exist?

Two factors perhaps explain the inclusion of test code generators in IDEs. First, test case generators like the one offered by Eclipse are an indication that while developers are using JUnit for writing tests, in many cases engineers are falling back on the traditional approach of writing the test case after the class under test has been written. This test-second approach has value, but it does not emphasize the importance of the test case, and inadequate test cases may be the result.

The second factor is a question of whether we should be using test-driven development to drive the design or drive the implementation effort. There are many differing opinions on this topic.

My thoughts are that the best designs are produced by modeling, not by coding, particularly when extremely large systems are involved. Combining a model-based approach with a test-driven approach would see a minimal set of architecturally significant interfaces and classes, along with their responsibilities, defined before the main coding effort commences. With this approach, test-driven development is used to both ratify the interfaces being promoted and guide the writing of the code that sits behind each interface.

Chapter 5 covers the benefits of UML models in software design.


A model-based approach to design does not preclude the use of code-level refactoring to further advance and refine the design.

Using a combination of modeling and testing techniques to drive the development means all architecturally significant interfaces can be legitimately defined ahead of the test cases. Ideally, our modeling tool will have generated the structure of the application for us. This raises the question as to whether the modeling tool can also generate the test cases in tandem. This approach has implications if a Model-Driven Architecture (MDA) paradigm is adopted, a subject we cover in the next section.

Unit Tests and MDA

An MDA approach enables a large percentage of the code needed for an application to be generated from a high-level platform-independent model.

Model-Driven Architecture is covered in Chapter 8.

It is possible to leverage the power of MDA tools in order to support test-driven development during the construction, or implementation, phase of a project. This can be achieved by having the MDA tool generate all necessary test cases on our behalf.

This approach has significant advantages that go beyond the time savings made by having the test cases autogenerated in an IDE. Many software projects fail to build up a comprehensive test suite for the application. Often, the result of unit test development efforts is a patchwork of tests spread unevenly across the application. Consequently, some areas of the system are overtested, [1] while other areas are completely overlooked.

[1] Some people would argue that you can never have enough testing.

This problem arises because responsibility for producing unit tests has traditionally rested with the individual developer. Each developer is likely to take a different view of the value of unit tests. Some developers might diligently generate tests for all their work, while others may be far less rigorous in their approach to testing.

Standards that stress the importance of testing and provide guidelines to the necessary test coverage expected can help. Nevertheless, it still falls to the individual to follow the standards and work within the guidelines.

With MDA, it is possible to make the development of a consistent test suite much less of a hit-or-miss affair. The architect should be thinking strategically as to where test coverage is required for the design. Based on these decisions, an MDA transformation mapping can be used to generate test cases for the appropriate elements within the model.

Using the MDA tool in this manner ensures a consistent approach to test coverage is taken. Developers simply need to fill in the implementation detail for each generated test case.

Generating Test Cases with AndroMDA

The cartridge system of AndroMDA makes it easy to extend the tool to generate test cases. Two options are available: either develop a new test cartridge or extend one of the existing cartridges by adding a new Velocity template.

Velocity templates and the Velocity template language are described in Chapter 6.

Updating one of the existing cartridges is perhaps the easiest way to get something in place quickly, while building a new cartridge from scratch allows tests to be generated for any project type simply by adding the test cartridge to the AndroMDA classpath. Note, however, that the latter option is a far more challenging and time-consuming undertaking.

Regardless of which approach is taken, a Velocity template lies at the heart of the solution. To get you started, Listing 14-3 provides an example UnitTest.vsl template.

Listing 14-3. UnitTest.vsl AndroMDA Cartridge Velocity Template
#set($packagename = $transform.findPackageName($class.package))
#set($remoteInterface = $str.lowerCaseFirstLetter($class.name))
package $packagename;

import java.util.Hashtable;

import javax.naming.InitialContext;

import junit.framework.TestCase;

public class ${class.name}Test extends TestCase
{
    private ${class.name} $remoteInterface = null;

    /*
     * Perform all set up work here
     */
    protected void setUp() throws Exception
    {
        super.setUp();

        Hashtable props = new Hashtable();
        props.put(InitialContext.INITIAL_CONTEXT_FACTORY,
                  "weblogic.jndi.WLInitialContextFactory");
        props.put(InitialContext.PROVIDER_URL, "URL_AS_PROPERTY");

        // Obtain remote interface to the session bean under test
        //
        $remoteInterface = ${class.name}Util.getHome(props).create();
    }

    protected void tearDown() throws Exception
    {
    }

#foreach ($op in $class.operations)
    #set($testMethod = $op.getName())
    public final void test$testMethod() throws Exception
    {
        // TODO Add your test here
    }
#end
}

In the configuration of the MDA cartridge, associate the template with all model elements carrying a stereotype of Service. The template then works alongside any cartridge that generates EJB components for classes stereotyped as a Service.

Refer to Chapter 8 for information on the specifics of configuring AndroMDA cartridges.

The template generates a test case for each service element from the model. Tests take on the name of the service with Test appended to the name.

Walking through the example template, the setUp() method follows from the previous example and obtains the remote interface to the session bean generated by AndroMDA for the service. The remote interface is intended for use in each of the test methods.

A Velocity Template Language (VTL) #foreach statement is used to iterate through each operation on the model element. The name of the operation is obtained by interrogating the model. From the list of operations, the test methods for each business method on the service can be built up.
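To make the expansion concrete, the following self-contained sketch mimics what the #foreach loop does: for each operation name harvested from the model, it emits one test-method stub. The operation names here are invented for illustration; in AndroMDA they would come from the model's metadata.

```java
import java.util.List;

public class TestStubGenerator {
    // Mimics the template's #foreach: one stub per model operation,
    // named test<operation>, with a TODO body for the developer.
    static String generateStubs(List<String> operations) {
        StringBuilder out = new StringBuilder();
        for (String op : operations) {
            out.append("public final void test").append(op)
               .append("() throws Exception {\n")
               .append("    // TODO Add your test here\n")
               .append("}\n");
        }
        return out.toString();
    }

    public static void main(String[] args) {
        // Hypothetical operations on a Service model element.
        System.out.print(generateStubs(List.of("createOrder", "cancelOrder")));
    }
}
```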

You can use the example shown in Listing 14-3 as a basis for further experimentation. AndroMDA makes all model metadata accessible from within the template, allowing the generation of sophisticated test cases. The trick is to avoid getting carried away and making the generated tests overly complicated. You'll find a little goes a long way.


Be devious and add a fail("Test not implemented") in every test method. This ensures tests fail unless the developer corrects the problem by supplying a valid test.
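Assuming the template from Listing 14-3, this is a one-line addition inside the #foreach block. No extra import is needed, because JUnit 3's fail() is inherited by the generated test class through TestCase.

```
#foreach ($op in $class.operations)
    #set($testMethod = $op.getName())
    public final void test$testMethod() throws Exception
    {
        // Forces a red bar until the developer supplies a real test
        fail("Test not implemented");
    }
#end
```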

    Rapid J2EE™ Development: An Adaptive Foundation for Enterprise Applications
    ISBN: 0131472208
    Year: 2005
    Pages: 159
    Author: Alan Monnox