If You Have Trouble Getting Started


If you don't have a lot of experience with automating tests and the attempt to leap straight into coding acceptance tests leaves you with writer's block, here's an intermediate step to help you along. (This runs contrary to the XP philosophy of immediately writing test code, but there's no shame in the learning process.) Go back to the high-level tests we talked about in Chapter 9. Think about common user scenarios: sequences of interactions between a user and the system. Think about the actions, data, and expected results of each test (a short code sketch follows the list):

  • Action. What the user does; for instance, "Login"

  • Data. The specific inputs used in the action; for instance, "User id and password"

  • Expected results. Minimum success criteria: the screens, transitions, messages, or data that persist as a result of an action or step; for instance, "Invalid user ID or password"
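
If it's easier for you to see those three elements as code than as prose, here's a minimal sketch of one way to capture a single step; the ScenarioStep class and its fields are purely illustrative and not part of any test framework:

// Purely illustrative: one scenario step captured as plain data.
public class ScenarioStep {
    private final String action;         // what the user does, e.g., "Login"
    private final String data;           // specific inputs, e.g., user id and password
    private final String expectedResult; // minimum success criteria

    public ScenarioStep(String action, String data, String expectedResult) {
        this.action = action;
        this.data = data;
        this.expectedResult = expectedResult;
    }

    public String getAction()         { return action; }
    public String getData()           { return data; }
    public String getExpectedResult() { return expectedResult; }
}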

If it helps you to think through the tests at first, lay these out in a format that's easy and clear for you (see Table 16.1). For example:

Scenario: Attempt to log in, leaving the user ID and password blank; it should fail.

Table 16.1. Simple scenario test

Action | Data                          | Expected Result
Login  | user id=blank, password=blank | "The user ID or password is invalid."

You can think of more complex scenarios whose steps you can translate into executable tests (see Table 16.2):

Scenario: Search with a misspelled category and get a notice suggesting the correct spelling; then search again with the correct spelling.

Table 16.2. More complex scenario test

Action    | Data                       | Expected Result
1. Login  | Valid user id and password | "Enter your search criteria and click the search button."
2. Search | Misspelled category        | Suggestion of corrected spelling
3. Search | Suggested correct spelling | List of businesses
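
Translated the same way, each numbered step becomes one action-and-assertion pair inside a single test. Again, this is only a sketch: FakeDirectoryApp is a hypothetical stand-in that hard-codes just enough behavior for the example to run, and a real acceptance test would drive the actual application instead:

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.junit.Test;

// Hypothetical stand-in for the application; each method corresponds
// to one "Action" row in Table 16.2.
class FakeDirectoryApp {
    private boolean loggedIn = false;

    String login(String userId, String password) {
        loggedIn = userId.length() > 0 && password.length() > 0;
        return loggedIn
            ? "Enter your search criteria and click the search button."
            : "The user ID or password is invalid.";
    }

    String suggestSpelling(String category) {
        return "restraunts".equals(category) ? "restaurants" : category;
    }

    List<String> search(String category) {
        if (loggedIn && "restaurants".equals(category)) {
            return Arrays.asList("Bob's Diner", "Cafe Ole");
        }
        return Collections.<String>emptyList();
    }
}

public class SearchScenarioTest {
    @Test
    public void misspelledSearchIsCorrectedThenFindsBusinesses() {
        FakeDirectoryApp app = new FakeDirectoryApp();

        // Step 1: Login with a valid user id and password
        assertEquals("Enter your search criteria and click the search button.",
                     app.login("testuser1", "testpsw1"));

        // Step 2: Search with a misspelled category; expect a suggestion
        String suggestion = app.suggestSpelling("restraunts");
        assertEquals("restaurants", suggestion);

        // Step 3: Search again with the suggested spelling; expect businesses
        List<String> businesses = app.search(suggestion);
        assertTrue(businesses.size() > 0);
    }
}

Because all three steps live in one test method, the scenario passes or fails as a whole, which keeps the bookkeeping simple.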

Whether you write the tests directly in executable format or use this method to help get you going, the tests will provide the signposts along the path of the team's XP journey. They need to be granular enough to show the project's true progress, so avoid unnecessary overlap. If you have a scenario with 300 steps and 10 of them fail, you probably have to fail the whole test case rather than come up with some complex formula for determining what percentage of it worked. Keep the tests simple.

You and the customer also have to come up with all the sets of data you want your tests to verify. Again it's best to code these directly into the tests, but if your customer has trouble conceptualizing that way or you just need a jump start, try any simple format to lay out the data you'll use to code the tests. Once you have more experience with automating, you can dispense with these extra steps. Table 16.3 shows a sample of test data you and the customer might have defined for a login scenario.

Table 16.3. Sample test data

Characteristics   | User ID   | Password  | E-mail Address     | Name         | Expected Result
Invalid (missing) | (missing) | (missing) |                    |              | "Please enter a valid user ID and password"
Invalid (missing) | jimbob    | (missing) |                    |              | "Please enter a valid user ID and password"
Invalid (bad id)  | JIMBOB    | Jumbo     |                    |              | "Please enter a valid user ID and password"
Invalid (bad psw) | jimbob    | JUMBO     |                    |              | "Please enter a valid user ID and password"
Valid             | jimbob    | Jumbo     | jim@azx.net        | Jim Thornkj  | "Welcome, Jim"
Valid             | testuser1 | testpsw1  | test1@xptester.org | Test Userone | "Welcome, Test"
Valid             | testuser2 | testpsw2  | test2@xptester.org | Test Usertwo | "Welcome, Test"

We'll talk more about these formats for documenting tests in the next chapter, in case your customer isn't comfortable with having acceptance tests documented only in the automated tests themselves.



Testing Extreme Programming
Testing Extreme Programming
ISBN: 0321113551
EAN: 2147483647
Year: 2005
Pages: 238

flylib.com © 2008-2017.
If you may any questions please contact us: flylib@qtcs.net