The first feature implemented provides a way to retrieve a recording's information via a Web service. Given this statement, the programmers wrote an implementation to satisfy the requirement. Now that the implementation is complete, we need a set of tests, written from the customer's perspective, to verify that the implementation does what the customer expects.
The first step is to develop a script to test the implementation manually. (I know we said that this was not desirable, but before we can automate something we have to know what it is.) The following are two samples of the test scripts:
1. Retrieve the recording with an id of 4 via the Web service.
2. Verify that the recording's title is "The Rising".
3. Verify that the artist's name is "Bruce Springsteen".
4. Verify that the release date is "7/30/2002".
5. Verify that the recording's duration is "72:51".
6. Verify that the label name is "Sony".
7. Verify that the track 1 title is "Lonesome Day".
8. Verify that the track 1 artist's name is "Bruce Springsteen".
9. Verify that the track 1 genre is "Rock".
10. Verify that the track 1 duration is "4:08".

(Steps 11 through 56 are not shown here. These steps verify the track information for the remaining tracks and are similar to steps 7 through 10.)

57. Verify that the recording's average rating is "4".
58. Verify that the review 1 reviewer name is "Bob".
59. Verify that the review 1 review content is "I thought it was great".
60. Verify that the review 1 rating is "5".

(There are additional steps for each review present, and they are similar to steps 58 through 60.)

The second script:

1. Retrieve the recording with an id of 100002 via the Web service. (This id is not currently assigned to any recording and should not be.)
2. Verify that the recording was not found.
Several other scripts, similar to the ones shown, verify information about other existing recordings in the database.
Because we have not implemented a UI, one way to run these scripts is to use the ASP.NET Web service infrastructure to exercise the Web service.
The customer will point the Internet Explorer browser to the URL of our Web service. Figure 7-1 shows a sample of the page that they see.
The customer enters the id of 4 and presses the Invoke button. The response that he receives is an XML document with the data about the recording returned by the Web service. A fragment of this XML document is shown here:
<?xml version="1.0" encoding="utf-8"?>
<Recording xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="http://nunit.org/webservices">
  <id>4</id>
  <title>The Rising</title>
  <artistName>Bruce Springsteen</artistName>
  <releaseDate>7/30/2002</releaseDate>
  <labelName>Sony</labelName>
  <tracks>
    <id>45</id>
    <title>Lonesome Day</title>
    <artistName>Bruce Springsteen</artistName>
    <duration>248</duration>
    <genreName>Rock</genreName>
  </tracks>
  <!-- the rest of the information -->
</Recording>
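Notice that the XML reports the track duration as 248 while the test script expects "4:08"; the service apparently returns the duration in seconds, which the tester must convert before comparing. As an illustrative sketch (in Python rather than the book's C#, and assuming the field is a whole number of seconds), the conversion is straightforward:

```python
def format_duration(total_seconds: int) -> str:
    """Format a duration given in seconds as m:ss, zero-padding the seconds."""
    minutes, seconds = divmod(total_seconds, 60)
    return f"{minutes}:{seconds:02d}"

# The <duration> of track 1 in the XML above:
print(format_duration(248))  # 4:08, the value step 10 of the script expects
```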
The customer has to verify each field in the XML manually. For example, for the recording's title, he locates the <title></title> tag of the <Recording> element and verifies that its content is "The Rising". (All you have to do is try the first script, which is longer than 60 steps, and you start to realize how long, tedious, and boring this is.) A single test takes several minutes and runs only when the customer has time to do it. Therefore, we should not expect the customer to be willing to perform these tests frequently or to do them when we need them done.
These test scripts are calling out for some form of automation. Computers are very good at comparing two values, so let's automate these tests so that they can be run whenever they are needed and are less error-prone.
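To make the idea concrete, the field checks the customer performs by eye can be expressed as assertions against the XML response. The book's tests are written in C# with NUnit; the following is only an illustrative sketch using Python's standard library, run against an abridged copy of the response shown earlier (a real test would fetch the document from the Web service instead):

```python
import xml.etree.ElementTree as ET

# Abridged copy of the Web service response for recording id 4.
RESPONSE = """<?xml version="1.0" encoding="utf-8"?>
<Recording xmlns="http://nunit.org/webservices">
  <id>4</id>
  <title>The Rising</title>
  <artistName>Bruce Springsteen</artistName>
  <labelName>Sony</labelName>
</Recording>"""

NS = {"r": "http://nunit.org/webservices"}

def verify_recording(xml_text: str) -> None:
    """Automate a few of the script's verification steps against the XML."""
    root = ET.fromstring(xml_text)
    assert root.findtext("r:title", namespaces=NS) == "The Rising"
    assert root.findtext("r:artistName", namespaces=NS) == "Bruce Springsteen"
    assert root.findtext("r:labelName", namespaces=NS) == "Sony"

verify_recording(RESPONSE)
print("all checks passed")
```

Once the comparisons live in code, the test runs in milliseconds rather than minutes, produces the same result every time, and no longer depends on the customer's availability.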