One Team's Experience with Direct-Call Test Automation

Now that you have an idea of the problems with calling your code directly from the test and what's involved in doing it a different way, consider the following successful experience of one XP team using direct calls:

A small team at BoldTech Systems set about creating a generic Web-based content management system from scratch, using Java. The team started with five members, all experienced programmers and architects. They decided to automate most of the tests using Java and the JUnit test framework one layer beneath the user interface. These automated acceptance tests were developed right along with the code, with the acceptance tests written, automated, and run before the end of each iteration.

The user interface was made up of Java server pages (JSPs). The bulk of the application logic was put into a lower layer built on Struts, a Web application navigation framework. Struts managed the forms, the actions, the information going in and out of the session, and the errors. The automated acceptance tests used mock objects for the forms, requests, responses, and sessions.

When a user pushes a button such as Submit, the Struts form calls a "button push" action, and the test methods emulate this with a mock object and the action. The action is where the business logic lives, and that is the point at which the system code is called. The tests could verify that a Save button led to correctly validated and persisted data in the database, and they could emulate a series of button pushes and verify that data persists between sessions.
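A test in this style might look like the following. This is a minimal, self-contained sketch, not the team's actual code: the `MockForm`, `MockSession`, and `SaveAction` classes are hypothetical stand-ins for the Struts action and mock objects described above, and a small `assertEquals` helper stands in for JUnit so the example compiles on its own.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-ins for the mock form and session described above.
class MockForm {
    private final Map<String, String> fields = new HashMap<>();
    public void set(String name, String value) { fields.put(name, value); }
    public String get(String name) { return fields.get(name); }
}

class MockSession {
    private final Map<String, Object> attributes = new HashMap<>();
    public void setAttribute(String key, Object value) { attributes.put(key, value); }
    public Object getAttribute(String key) { return attributes.get(key); }
}

// The action holds the business logic; the test calls it directly,
// just as the Struts layer would when the user pushes Save.
class SaveAction {
    public String execute(MockForm form, MockSession session) {
        String title = form.get("title");
        if (title == null || title.isEmpty()) {
            return "error";                        // validation failed
        }
        session.setAttribute("savedTitle", title); // stands in for persistence
        return "success";
    }
}

public class SaveActionTest {
    // Minimal replacement for JUnit's assertEquals.
    static void assertEquals(Object expected, Object actual) {
        if (expected == null ? actual != null : !expected.equals(actual)) {
            throw new AssertionError("expected " + expected + " but was " + actual);
        }
    }

    public static void main(String[] args) {
        MockForm form = new MockForm();
        MockSession session = new MockSession();
        SaveAction action = new SaveAction();

        // Emulate the user filling in the form and pushing Save.
        form.set("title", "First Article");
        assertEquals("success", action.execute(form, session));
        assertEquals("First Article", session.getAttribute("savedTitle"));

        // An empty title should fail validation.
        form.set("title", "");
        assertEquals("error", action.execute(form, session));

        System.out.println("all checks passed");
    }
}
```

Nothing here touches a JSP or a browser: the test exercises the same action code the UI would invoke, which is the essence of testing one layer beneath the user interface.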

This team designed their code specifically for ease of testing. Because the code was well formed, with all requests processed in a central location, very little code was missed by calling the system code directly. This was a great example of writing the system for testability.
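One way to picture "all requests processed in a central location" is a single dispatch point that both the JSP layer and the tests call. The `RequestProcessor` class below is a hypothetical sketch of that idea, not the team's code:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.BiFunction;

// Hypothetical central dispatcher: every request, whether it comes from
// a JSP or from a test, flows through process(), so a test that calls
// process() directly exercises almost the same path a real user does.
class RequestProcessor {
    // An action takes the form fields and the session and returns an outcome.
    private final Map<String,
            BiFunction<Map<String, String>, Map<String, Object>, String>> actions =
            new HashMap<>();

    public void register(String name,
            BiFunction<Map<String, String>, Map<String, Object>, String> action) {
        actions.put(name, action);
    }

    public String process(String actionName,
                          Map<String, String> form,
                          Map<String, Object> session) {
        BiFunction<Map<String, String>, Map<String, Object>, String> action =
                actions.get(actionName);
        if (action == null) {
            // Errors are recorded in the session, where tests can inspect them.
            session.put("error", "unknown action: " + actionName);
            return "error";
        }
        return action.apply(form, session);
    }
}
```

A test registers (or reuses) the real actions and calls `process("save", form, session)` directly; with this design, only the JSP rendering itself is left outside the tests' reach.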

For validation, the tests go into the session and validate the application state (what is actually in memory for that user session) rather than looking at, for example, an HTML table in the user interface. They validate the data in its native structure, before it has been rendered. Because the tests are written in Java using the JUnit test framework, they're flexible and easy for the programmers to maintain. It's easy to drive the tests with data and create a variety of scenarios, including error testing. The tests can also simulate multiple concurrent users; for example, a test can have two users try to delete the same record.
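The two-users-delete-the-same-record scenario can be simulated by interleaving calls from two user sessions against shared state. The `RecordStore` class below is a hypothetical sketch (the real tests would go through the application's own persistence code); it shows how a direct-call test can check that exactly one of two competing deletes succeeds:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical shared record store. delete() reports whether the record
// was still present, so the second of two competing deletes sees false.
class RecordStore {
    private final Map<Integer, String> records = new ConcurrentHashMap<>();
    public void save(int id, String data) { records.put(id, data); }
    public boolean delete(int id) { return records.remove(id) != null; }
}

public class ConcurrentDeleteTest {
    public static void main(String[] args) {
        RecordStore store = new RecordStore();
        store.save(42, "press release");

        // Two user sessions try to delete the same record; the test
        // simulates the race by interleaving the calls.
        boolean firstUserDeleted = store.delete(42);
        boolean secondUserDeleted = store.delete(42);

        // Exactly one delete should succeed; the other user should
        // be shown an error by the application.
        if (!firstUserDeleted || secondUserDeleted) {
            throw new AssertionError("exactly one delete should succeed");
        }
        System.out.println("second delete correctly rejected");
    }
}
```

Driving this same shape of test from a data table of record IDs and user pairs is what makes the suite cheap to extend with new scenarios, including error cases.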

By testing below the user interface, the team finds the tests cost little to maintain. About 20% of the development effort is spent on creating and maintaining the automated acceptance tests, and most of that is on test creation. These tests cover about 80% of the application's functionality.

Team members report that the automated acceptance tests routinely find problems that the unit tests don't, and they depend on these tests for confidence that their changes haven't broken anything. The suite takes about ten minutes to run, so it isn't run quite as often as the unit tests, but the programmers still run it several times each day.



Testing Extreme Programming
ISBN: 0321113551
Year: 2005
Pages: 238
