4.10 Performance Tests


Like mock objects, unit testing for performance is a significant topic in its own right. Software performance is often neglected at the unit testing level and considered only during functional testing. However, performance-oriented unit tests are powerful tools, especially for applications that must meet specific performance goals. It has been reported that Apple's Safari browser was developed in an environment that automatically ran performance tests on any code that was checked in; code was rejected if it did not meet or exceed the speed standards of previous versions. Thus, the unit tests ensured that the code's performance continuously improved.

When a piece of code has a performance problem, it is very useful to first write a test that reveals the problem. This performance test not only lets you know when the code has achieved the desired performance, but also acts as a "canary in the coal mine" that indicates if the performance degrades again.

Tools intended specifically for performance-oriented unit testing are available, such as JUnitPerf. However, it is not difficult to develop performance tests within any unit test framework. This section gives an example of a unit test that tests the speed of retrieving a Book from a Library.

The initial question when writing a performance test is this: what is the performance criterion that the test must meet to pass? Usually, this is expressed in terms of the amount of time that a certain action may take. If the action takes too long, the criterion has not been met, and the test fails.

The Library class developed so far has a very poorly performing algorithm to get a Book. It serially reads through the collection of Books, doing string comparisons on each one until the desired Book is found. This awful lookup strategy is ideal for demonstrating a performance test that fails initially, but succeeds after a little refactoring. Example 4-22 shows the unit test class LibraryPerfTest.

Example 4-22. Performance unit test LibraryPerfTest
LibraryPerfTest.java

   import junit.framework.*;
   import java.util.*;

   public class LibraryPerfTest extends TestCase {

      private Library library;

      public void setUp( ) {
         library = new Library( );
         for ( int i = 0; i < 100000; i++ ) {
            String title = "book" + i;
            String author = "author" + i;
            library.addBook( new Book( title, author ) );
         }
      }

      public void testGetBookPerf( ) {
         double maxTime = 100; // milliseconds
         long startTime = System.currentTimeMillis( );
         Book book = library.getBook( "book99999" );
         long endTime = System.currentTimeMillis( );
         long time = endTime - startTime;
         assertTrue( time < maxTime );
         assertEquals( "book99999", book.getTitle( ) );
      }
   }

LibraryPerfTest is implemented as a test fixture, since it is likely that more performance tests will be added. The setUp() method adds 100,000 Books to the Library. The test method testGetBookPerf() measures the amount of time it takes to look up a Book. It uses the method currentTimeMillis() to get the system time before and after the getBook() operation, calculates the elapsed time, and compares it to a performance criterion of 100 milliseconds (0.1 second). As a sanity check, it also asserts that the expected Book was found.

With the Vector-based implementation of Library, the unit test fails:

   > java -classpath ".;junit.jar" junit.textui.TestRunner LibraryPerfTest
   .F
   Time: 0.562
   There was 1 failure:
   1) testGetBookPerf(LibraryPerfTest)junit.framework.AssertionFailedError
           at LibraryPerfTest.testGetBookPerf(LibraryPerfTest.java:23)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)

   FAILURES!!!
   Tests run: 1,  Failures: 1,  Errors: 0

Library can be refactored to use a Hashtable to store Books. (The refactored Library code is given in the next section, "New Library and Book Code.") With this change, lookups by title are efficient, and the test passes:
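As a rough illustration of the idea (the book's actual refactored code appears in the next section; the class and field names here are a simplified sketch), keying a Hashtable by title turns the linear scan into a constant-time hash lookup:

```java
import java.util.Hashtable;
import java.util.Map;

// Simplified sketch of a Book, reduced to the fields used here.
class Book {
    private final String title;
    private final String author;
    Book(String title, String author) { this.title = title; this.author = author; }
    String getTitle() { return title; }
}

// Hypothetical Hashtable-based Library: addBook() stores each Book
// under its title, so getBook() is a single hash lookup instead of
// a serial scan with string comparisons.
class Library {
    private final Map<String, Book> books = new Hashtable<String, Book>();
    void addBook(Book book) { books.put(book.getTitle(), book); }
    Book getBook(String title) { return books.get(title); }
}

public class LibrarySketch {
    public static void main(String[] args) {
        Library library = new Library();
        for (int i = 0; i < 100000; i++) {
            library.addBook(new Book("book" + i, "author" + i));
        }
        // Lookup cost no longer depends on where the Book sits in the collection.
        System.out.println(library.getBook("book99999").getTitle()); // prints book99999
    }
}
```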

   > java -classpath ".;junit.jar" junit.textui.TestRunner LibraryPerfTest
   .
   Time: 0.734

   OK (1 test)

The total test time has increased. This is because addBook() takes longer with the Hashtable implementation, since each of the 100,000 Books added in setUp() must now be hashed and inserted.

The hardcoded time value of 100 milliseconds used in this example can produce different results when the test is run on faster or slower platforms. Even when run on the same platform, varying machine loads and process priorities mean that a performance test can succeed or fail on subsequent runs without any code changes. Accounting for such variations can present a challenge when designing performance tests. There are a number of techniques to deal with these problems. Consistently running performance tests on the same platform is helpful. Test timing can be based on the time required to run a reference operation rather than on a hardcoded time value, allowing for system performance variations. Timing multiple repetitions of an operation reduces the effect of transient glitches. Finally, performance tests can use order-of-magnitude timing ranges rather than exact minimum timings, so that code meeting general performance goals will pass.
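Two of these techniques, timing repeated runs and comparing against a reference operation rather than a hardcoded limit, can be combined in one test. The sketch below is hypothetical (the `timeNanos` helper and the 100x margin are illustrative choices, not from the book): it times many repetitions of both a cheap reference lookup and the lookup under test on the same machine, then applies an order-of-magnitude criterion.

```java
import java.util.HashMap;
import java.util.Map;

public class RelativeTimingSketch {

    // Time 'reps' repetitions of an operation, in nanoseconds.
    // Repetition smooths out transient glitches in any single run.
    static long timeNanos(Runnable op, int reps) {
        long start = System.nanoTime();
        for (int i = 0; i < reps; i++) op.run();
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        final int REPS = 1000;
        final Map<String, String> map = new HashMap<String, String>();
        for (int i = 0; i < 100000; i++) map.put("book" + i, "author" + i);

        // Reference operation measured on this platform, this run.
        long reference = timeNanos(new Runnable() {
            public void run() { map.get("book0"); }
        }, REPS);

        // Operation under test.
        long measured = timeNanos(new Runnable() {
            public void run() { map.get("book99999"); }
        }, REPS);

        // Order-of-magnitude criterion: pass if within 100x of the
        // reference, so the test tolerates faster or slower machines.
        boolean pass = measured < Math.max(reference, 1) * 100;
        System.out.println(pass ? "PASS" : "FAIL");
    }
}
```

Because both measurements are taken on the same machine in the same run, the criterion scales with platform speed instead of baking in a wall-clock number.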

Unit Test Frameworks
ISBN: 0596006896
Year: 2006
Pages: 146
Authors: Paul Hamill
