Validation Criteria


As promised, here's an example of the xtrackif.testcase file, with the simplest possible validations defined for login, createUserId, and deleteUserId:

Listing 24.1 xtrackif.testcase
XTrackIf
[
Validation <validationTDI=Vlogin&
            text=XTrack Stories&>
Validation <validationTDI=VcreateUserId&
            text=Record added.&>
Validation <validationTDI=VdeleteUserId&
            text=Record deleted.&>
]

We've defined three validations here, one for each module. For the login module, the validation says, in essence, to look for the text XTrack Stories. When a login succeeds in XTrack, the user's browser is redirected to the list of stories, which contains this text. If it fails, the user remains on the login page.
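Although the book doesn't show how the framework reads this file, the structure of the entries is regular enough to sketch. The following is a hypothetical illustration (not the actual MDS parser) of extracting each Validation entry's validationTDI name and expected text into a map; the class and method names are assumptions for the sake of the example:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch only: pulls the validationTDI/text pairs out of
// Validation entries like those in Listing 24.1. The real MDS framework's
// parsing code is not shown in the book.
public class ValidationParser {
    static Map<String, String> parse(String testcase) {
        Map<String, String> validations = new LinkedHashMap<>();
        // Matches: Validation <validationTDI=NAME& text=EXPECTED&>
        Pattern p = Pattern.compile(
            "Validation\\s*<validationTDI=(\\w+)&\\s*text=([^&]+)&>");
        Matcher m = p.matcher(testcase);
        while (m.find()) {
            validations.put(m.group(1), m.group(2));
        }
        return validations;
    }

    public static void main(String[] args) {
        String entry = "Validation <validationTDI=Vlogin& text=XTrack Stories&>";
        System.out.println(parse(entry)); // prints {Vlogin=XTrack Stories}
    }
}
```

Reading all three entries from Listing 24.1 this way would yield a map from Vlogin, VcreateUserId, and VdeleteUserId to their expected text snippets.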

In previous discussions, where we were making direct calls to the system code we were testing, the issue of validation didn't come up. The validation was there, but it happened inside the piece of code we were calling. The XTrack session.login method, for instance, returned true or false depending on whether the attempt passed or failed.

Now that we're interacting with the system through the user interface, we have to decide for ourselves whether an action was successful. In this case, it means examining the system's response to our login attempt to determine whether the attempt was successful. We'll have to do something similar for the attempts to create a user, delete a user id, and so on.
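The core of such a check is small: with no boolean return value to inspect, success or failure is inferred by scanning the page the server sends back for the expected text. A minimal sketch of the idea (the class and method names here are assumptions, not the framework's API):

```java
// Minimal sketch: infer pass/fail from the response body rather than
// from a return value, by checking for a distinguishing text marker.
public class TextCheck {
    static boolean validate(String responseBody, String expectedText) {
        return responseBody != null && responseBody.contains(expectedText);
    }

    public static void main(String[] args) {
        // A successful login redirects to the story list, which
        // contains the marker text; the login page does not.
        String afterLogin = "<html><body><h2>XTrack Stories</h2>...</body></html>";
        System.out.println(validate(afterLogin, "XTrack Stories")); // prints true
        System.out.println(validate(afterLogin, "Record added."));  // prints false
    }
}
```

The same one-line check covers the create and delete validations; only the marker text ("Record added.", "Record deleted.") changes.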

It's easy to get too detailed when specifying validations. The page resulting from a successful login contains far more than the text XTrack Stories: links, buttons, and other form elements, among other things. Wouldn't it be better to validate many, if not all, of these items? The MDS framework does, in fact, provide primitives for validating the presence of links, forms, form variables, and tables, along with a way to extend them to application-specific items. For an XP project, however, you probably don't want to take this route.

It's better to start with the simplest possible validation that distinguishes a success from a failure and enhance it only if it proves insufficient for some reason. Remember, we need only validate the minimum necessary to know if our story is correctly implemented. The more detailed and specific these validations get, the more difficult they are to maintain. This is why the MDS framework breaks them out of the code and into a separate file. The reason they tend to be so volatile is that they're completely dependent on the user interface, which is generally the most change-prone part of the system.



Testing Extreme Programming
ISBN: 0321113551
Year: 2005
Pages: 238
