Capture/Playback Test Automation

Some automated test solutions involve special hardware, software, programming languages, or all three. These are not always a good fit for testers who may not have heavy hardware and/or software engineering backgrounds. These solutions also tend to be expensive because of their special-purpose nature. A preferable approach is one that is light on programming and keeps the tester focused on the art and science of testing.

Capture/playback test automation is like using a digital video recorder while you test.

Your inputs are recorded as you play the game, and at various points you designate, screen images are captured for comparison whenever you "replay" the test in the future. This is well suited to the graphical nature of videogames.

Capture/playback testing adds a slight overhead to running the test manually the first time. At various points during your manual testing you will need to guide the test tool to capture the game data and screen elements that must be checked by the test. This investment pays off when you can automatically play back the tests every time after that. Maintaining the tests is expensive when the game features being tested are unstable. Time is wasted on implementing, reviewing, and managing changes that are only good for a few test runs. In that case, you should postpone automation until the requirements, design, and code are stable.

Vermont HighTest (VHT) is a capture/playback test product that was used for the examples in this chapter. The book's CD-ROM includes a link to the VHT Web site where you can download a 30-day demo of VHT as well as a User's Guide. Each VHT "test" consists of database files containing data captured during recording and a document file containing the test code. Log files (documents) are created each time the test is run.

Recording

You should have some idea of what you are going to record and test for before you start automating. You can use tests that are produced by the various design methodologies described elsewhere in this book or test scripts that have been in your family for generations. In either case, you need to define or plan ahead of time which screen or window elements you are going to check (capture) in order to establish that the game is working right.

You start recording a test in VHT by selecting New from the Record menu or by pressing F11. During recording, a Record Bar window appears on your screen as shown in Figure 17.1. The icons in the first row of this window are used to capture various types of screen contents such as the entire screen, a specific window, or a region that is defined by the tester during recording. The icons in the second row are used to add comment or control statements to the test as it is being recorded. The stop sign icon in the third row is used to suspend recording.


Figure 17.1: Control panel with test recording controls.

Another way you can use the capture function is to record ad hoc and playability testing. The test can't detect problems it's not programmed to look for, but you can pause at any time to capture interesting windows or screen images to show to skeptical developers. The recording can be used to see what steps got you to the problem spot.

This is sort of like the airplane black box recorder that's always on, so that when an accident occurs, analysts can work backward from the data to draw conclusions about what caused it.

Here is a breakdown of what a recorded test looks like:

 ;FileName: c:\documents and settings\administrator\desktop\vht test files\test1.inb
 ;
 ;Created: Tue Dec 21 15:51:46 2004
 ;Author:
 ;
 ;Product:
 ;Abstract:
 ;

This part is the header that is automatically added by VHT. It includes the filename and path of the test file and the creation date of the file. There are also placeholders to provide the author's name, a product identifier, and an abstract description of the test.
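For example, a filled-in header might read as follows. The author, product, and abstract values shown here are illustrative; they are not part of the recorded file:

 ;FileName: c:\documents and settings\administrator\desktop\vht test files\test1.inb
 ;
 ;Created: Tue Dec 21 15:51:46 2004
 ;Author: J. Tester
 ;
 ;Product: Warhammer 40,000: Dawn of War
 ;Abstract: Check Army Painter red/green palette increment and decrement
 ;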

 ActivateWindow("Progman", "Program Manager", NULL, 10.0) ActivateWindow("SysListView32", "FolderView", 1, 10.0) ClickListView(1, "Dawn of War", DBLCLK, 0.0) ActivateWindow("Plat::Window {DB3DC0D7-BBA3-4d06-BCD8-40CD448B4AE3}", "Warhammer 40,000: Dawn of War", NULL, 10.0) 

Here the recording begins to capture your test actions. It shows that the test starts by launching the game window. As you can see from the data in the test commands, you are testing Dawn of War. The ClickListView command captures and plays back the act of double-clicking on the game's icon. It will find the game icon no matter where it is located on your desktop, even if it has moved since you recorded the test.

 Keys("[Esc][Esc][Esc]", 0.0, 1.04, 1.07) 

This is a recording of the three times I hit the Esc key to skip over the introductory splash screens.

 MouseClick(540, 396, INTEL, LEFT, 19.59) 

This is the left mouse click to select the Army Painter from the main menu. The INTEL parameter specifies "Intelligent" mode (did you think it was some sort of advertising for a chip maker?). This will adjust the mouse coordinates for the current window.

 CompareScreen("Region 1") 

This line is added after you select the Compare Region icon from the Record Bar. It's the fourth one from the left on the first row of icons. The recording pauses while you drag a rectangle around the area of the screen you want to capture. VHT automatically names the region and inserts the CompareScreen command into the test file.

 MouseClick(216, 406, INTEL, LEFT, 0.0)
 MouseDoubleClick(216, 406, INTEL, LEFT, 0.5)
 MouseDoubleClick(216, 406, INTEL, LEFT, 0.78)
 MouseDoubleClick(216, 406, INTEL, LEFT, 0.56)
 MouseDoubleClick(216, 406, INTEL, LEFT, 0.51)
 MouseClick(216, 406, INTEL, LEFT, 0.46)

These steps click an up arrow on the screen to increment the red color palette setting 10 times. Depending on how fast you click, VHT will record either a single MouseClick or a MouseDoubleClick. The two MouseClick commands plus the four MouseDoubleClick commands produce a total of 10 clicks (2 × 1 + 4 × 2 = 10).

 MouseDoubleClick(230, 420, INTEL, LEFT, 2.14)
 MouseDoubleClick(230, 420, INTEL, LEFT, 0.51)
 MouseDoubleClick(230, 420, INTEL, LEFT, 0.46)
 MouseDoubleClick(230, 420, INTEL, LEFT, 0.5)
 MouseDoubleClick(230, 420, INTEL, LEFT, 0.5)

This series of five double-clicks decrements the green color palette setting by 10.

 MouseClick(81, 6, INTEL, LEFT, 5.12)
 CompareScreen("Region 2")

Finally, these two lines move the cursor off the down arrow and then compare the updated portion of the screen (captured using the Record Bar again) to the one that was originally recorded. The images captured for Region1 (left) and Region2 (right) are shown in Figure 17.2.


Figure 17.2: Region1 and Region2 images captured during recording.

Editing

VHT produces a test file during recording. It also inserts a header at the top of the test file that includes the filename of the test and the date and time it was captured. This header also has placeholders for you to enter "Author:", "Product:", and "Abstract:" information. You can edit this file using the tool's own editor or export it to a text file that can be imported back in after you are done with your editing.

Once the test is recorded, you can make changes to it. For example, you can add an inline comment to any of the commands in the file by entering a semicolon followed by your comment text. Here's a comment explaining the three Esc keys:

 Keys("[Esc][Esc][Esc]", 0.0, 1.04, 1.07) ;Skip splash screens 

You may also find that you want to delete commands from the test after you've already made a recording. You can, for example, remove an accidental key press or mouse click that is not meant to be part of the test. These can be removed by deleting the offending line from the test file. If you want to be more cautious, you can comment out any line by inserting a leading semicolon.
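For example, if a stray mouse click was captured during recording, you could neutralize it like this. The coordinates are illustrative, not from the recorded test:

 ;MouseClick(318, 222, INTEL, LEFT, 0.42) ;Accidental click disabled during editing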

If you left something out and want to add it without going through the capture process all over again, the same commands available from the Record Bar can be typed directly into your test file, or altered to fit your needs. Appendix B of the Vermont HighTest User's Guide lists the available commands along with details of their syntax and parameters.
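For example, you could type a new line such as the following directly into the test file, reusing the Keys syntax the recorder produces. This line is hand-written for illustration; check Appendix B for the exact timing parameters:

 Keys("[Esc]", 0.0) ;Hand-typed: back out of the current menu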

VHT also provides control and function statements. These can be used to create chunks of automation that you can add to your scripts without having to recapture those sequences. You can even use variables to alter the number of times you perform various operations during the test, such as multiple mouse button clicks. Details of these commands are provided in Appendix A of the VHT User's Guide.
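As a sketch of the idea, the 10 recorded palette clicks could be replaced with a variable-driven loop. The loop keywords below are hypothetical; Appendix A of the VHT User's Guide gives the actual control and function statement syntax:

 ;Hypothetical control syntax - see Appendix A for the real statements
 clicks = 10
 While (clicks > 0)
     MouseClick(216, 406, INTEL, LEFT, 0.5)
     clicks = clicks - 1
 EndWhile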

If you are so inclined, you can create automated tests entirely from scratch with the commands available to you. If you are savvy with the automated command syntax, you could even automate the automation by using your tests to generate text files that can be imported and run by the automated test tool.

Test editing is also useful for automating a set of tests that only vary slightly from one another. Capture the "master" test normally, and then make copies of the recorded file. Edit the recordings as desired. This can save a lot of time versus trying to repeat the same capture process over and over again for each new test.
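For example, a copy of the Army Painter master test could be edited to exercise the blue palette instead of the red one by changing only the click coordinates. The blue arrow coordinates here are illustrative:

 ;Variant of the master test: increment the blue palette instead of red
 MouseClick(216, 434, INTEL, LEFT, 0.5) ;Illustrative coordinates for the blue up arrow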

Playback

The Run (F12) option in the playback menu runs the test that is open in the VHT editor or prompts you to select a test file if none is open. Running the test produces a log file.

The log file consists of three parts: the header, the body, and the footer. Information within the body consists of your recorded test commands on the left side of the page and the results of each command to the right. The possible result values are Success or Failure. The following listing shows a successful playback: all of the operations return a Success result.

 ************************************************************
 * FileName:         c:\documents and settings\administrator\desktop\vht test files\test1.inb
 * Log Created on:   Wed Dec 22 06:26:43 2004
 *
 * Playback Options:
 *     Play Speed: 50
 *     Terminate on Failure:
 *
 *     Ignore During Compare:
 ************************************************************
 ActivateWindow > "Progman", "Program Manager", NULL, 10.0......Result: Success
 ActivateWindow > "SysListView32", "FolderView", 1, 10.0........Result: Success
 ClickListView > 1, "Dawn of War", DBLCLK, 0.0..................Result: Success
 ActivateWindow > "Plat::Window {DB3DC0D7-BBA3-4d06-BCD8-40CD448B4AE3}", "Warhammer 40,000: Dawn of War", NULL, 10.0...Result: Success
 Keys > "[Esc][Esc][Esc]", 0.0, 1.04, 1.07......................Result: Success
 MouseClick > 540, 396, INTEL, LEFT, 19.59......................Result: Success
 CompareScreen > "Region 1".....................................Result: Success
 MouseClick > 216, 406, INTEL, LEFT, 0.0........................Result: Success
 MouseDoubleClick > 216, 406, INTEL, LEFT, 0.5..................Result: Success
 MouseDoubleClick > 216, 406, INTEL, LEFT, 0.78.................Result: Success
 MouseDoubleClick > 216, 406, INTEL, LEFT, 0.56.................Result: Success
 MouseDoubleClick > 216, 406, INTEL, LEFT, 0.51.................Result: Success
 MouseClick > 216, 406, INTEL, LEFT, 0.46.......................Result: Success
 MouseDoubleClick > 230, 420, INTEL, LEFT, 2.14.................Result: Success
 MouseDoubleClick > 230, 420, INTEL, LEFT, 0.51.................Result: Success
 MouseDoubleClick > 230, 420, INTEL, LEFT, 0.46.................Result: Success
 MouseDoubleClick > 230, 420, INTEL, LEFT, 0.5..................Result: Success
 MouseDoubleClick > 230, 420, INTEL, LEFT, 0.5..................Result: Success
 MouseClick > 81, 6, INTEL, LEFT, 5.12..........................Result: Success
 CompareScreen > "Region 2".....................................Result: Success
 ************************************************************
 * Log Ended on: Wed Dec 22 06:28:06 2004
 ************************************************************

Not all test runs will be so clean. The playback is very unforgiving. Failures can occur due to minute problems in the position of an item on the screen or the slightest variation in any of its attributes, such as color, font, or value.

Playback Options

Playback options are independent of the recorded test file. This allows you to "turn the knobs" for the tests you've already recorded without having to recapture the results.

Speed is an important playback option. In Vermont HighTest the playback speed ranges from 1 to 100, where a value of 50 represents the real-time speed of the test recording. Running the test back at a higher speed can improve your automated test throughput. However, running at too fast a rate can cause new test failures. You will have to analyze the results to distinguish between failures caused by improper test speed versus actual game code failures.

Another way to utilize the Speed value is to check the game against potential user speed "profiles." Do elements of the game time out too soon if the player takes a long time to respond to a dialog or complete a mission? Do game input buffers overflow if they are bombarded with rapid mouse clicks, button presses, or key presses? These can be real defects in the game code.
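For example, to probe the first kind of problem, you could edit a copy of the test to simulate a slow player by inserting a long Delay before a menu response. The 120-second value here is illustrative:

 Delay(120) ;Simulate a player who walks away before choosing from the main menu
 MouseClick(540, 396, INTEL, LEFT, 0.0)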

Running the Dawn of War test at maximum speed does produce some failures. Look for the entries in the following log file that have a Failure result:

 ************************************************************
 * FileName:         c:\documents and settings\administrator\desktop\vht test files\playspeedmax\test1.inb
 * Log Created on:   Tue Dec 21 19:17:02 2004
 *
 * Playback Options:
 *     Play Speed: 100
 *     Terminate on Failure:
 *
 *     Ignore During Compare:
 ************************************************************
 ActivateWindow > "Progman", "Program Manager", NULL, 10.0......Result: Success
 ActivateWindow > "SysListView32", "FolderView", 1, 10.0........Result: Success
 ClickListView > 1, "Dawn of War", DBLCLK, 0.0..................Result: Success
 ActivateWindow > "Plat::Window {DB3DC0D7-BBA3-4d06-BCD8-40CD448B4AE3}", "Warhammer 40,000: Dawn of War", NULL, 10.0...Result: Success
 Keys > "[Esc][Esc][Esc]", 0.0, 1.04, 1.07......................Result: Success
 MouseClick > 540, 396, INTEL, LEFT, 19.59......................Result: Success
 CompareScreen > "Region 1".....................................Result: Failure
 MouseClick > 216, 406, INTEL, LEFT, 0.0........................Result: Success
 MouseDoubleClick > 216, 406, INTEL, LEFT, 0.5..................Result: Success
 MouseDoubleClick > 216, 406, INTEL, LEFT, 0.78.................Result: Success
 MouseDoubleClick > 216, 406, INTEL, LEFT, 0.56.................Result: Success
 MouseDoubleClick > 216, 406, INTEL, LEFT, 0.51.................Result: Success
 MouseClick > 216, 406, INTEL, LEFT, 0.46.......................Result: Success
 MouseDoubleClick > 230, 420, INTEL, LEFT, 2.14.................Result: Success
 MouseDoubleClick > 230, 420, INTEL, LEFT, 0.51.................Result: Success
 MouseDoubleClick > 230, 420, INTEL, LEFT, 0.46.................Result: Success
 MouseDoubleClick > 230, 420, INTEL, LEFT, 0.5..................Result: Success
 MouseDoubleClick > 230, 420, INTEL, LEFT, 0.5..................Result: Success
 MouseClick > 81, 6, INTEL, LEFT, 5.12..........................Result: Success
 CompareScreen > "Region 2".....................................Result: Failure
 ************************************************************
 * Log Ended on: Tue Dec 21 19:17:32 2004
 ************************************************************

The failures occurred in the two places where a portion of the Army Painter screen was checked by the CompareScreen function. One way to diagnose the cause of this type of problem is to run the automated test again and carefully watch what happens on the screen. In this case, all of the mouse clicks happened while the game was still loading, so they were ignored. After that the game just sat in the main menu, shown in Figure 17.3, because all of the recorded test steps had already been played back.


Figure 17.3: Dawn of War main menu screen with Army Painter highlighted.

You can also try to manually verify that there's no problem with the main menu timing by clicking the Army Painter choice as soon as the menu comes up on the screen. If that works correctly, you should be able to run the test faster than it was captured in real time. It's a matter of making the right changes to the test file. VHT provides a Delay function that you can add during capture by selecting it from the Record Bar. You can also edit the test file to insert this command. An important feature of this command is that it will delay for the specified number of seconds regardless of what the playback Speed is set to. For this case, select a relatively long delay, such as 15 seconds, to account for target machines with slow CPUs or disk access times. Here's the portion of the test code with a commented Delay command inserted:

 ClickListView(1, "Dawn of War", DBLCLK, 0.0)
 ActivateWindow("Plat::Window {DB3DC0D7-BBA3-4d06-BCD8-40CD448B4AE3}", "Warhammer 40,000: Dawn of War", NULL, 10.0)
 Keys("[Esc][Esc][Esc]", 0.0, 1.04, 1.07) ;Skip splash screens
 Delay(15) ;Wait for game to load and bring up main menu
 MouseClick(540, 396, INTEL, LEFT, 2.43)
 CompareScreen("Region 1")

Now the test passes when it runs at maximum speed.

Once one of the test steps fails, it might be a waste of time for the test to continue running through the rest of its commands. This is especially true for a test file with hundreds or thousands of lines of code. You could also waste time looking through pages of the log file to find the place where the test originally failed. You can save time by adjusting the VHT playback preferences to specify one or more categories of commands that will halt test execution immediately if they fail. Figure 17.4 shows the CompareScreen function added to the list of commands that will cause playback to Terminate on Failure.


Figure 17.4: Playback options set to Speed = 100 and Terminate on Failure of CompareScreen .

When the test fails on any of its Terminate on Failure operations, a special pop-up window appears and the test playback stops. Figure 17.5 shows the terminate pop-up you get when a CompareScreen operation fails.


Figure 17.5: Terminate on Failure pop-up window.

Once you specify which commands will cause Terminate on Failure, VHT will automatically list them in the log file header. You can also see in the following code how test logging and playback stopped once CompareScreen("Region 1") failed.

 ************************************************************
 * FileName:         c:\documents and settings\administrator\desktop\vht test files\toftest1.inb
 * Log Created on:   Tue Dec 21 19:40:56 2004
 *
 * Playback Options:
 *     Play Speed: 100
 *     Terminate on Failure:
 *           CompareScreen,
 *     Ignore During Compare:
 ************************************************************
 ActivateWindow > "Progman", "Program Manager", NULL, 10.0......Result: Success
 ActivateWindow > "SysListView32", "FolderView", 1, 10.0........Result: Success
 ClickListView > 1, "Dawn of War", DBLCLK, 0.0..................Result: Success
 ActivateWindow > "Plat::Window {DB3DC0D7-BBA3-4d06-BCD8-40CD448B4AE3}", "Warhammer 40,000: Dawn of War", NULL, 10.0...Result: Success
 Keys > "[Esc][Esc][Esc]", 0.0, 1.04, 1.07......................Result: Success
 MouseClick > 540, 396, INTEL, LEFT, 19.59......................Result: Success
 CompareScreen > "Region 1".....................................Result: Failure


