Test Case Planning Overview


So where exactly does test case planning fit into the grand scheme of testing? Figure 18.1 shows the relationships among the different types of test plans.

Figure 18.1. The different levels of test documents all interact, and they vary in whether their importance lies in the document itself or in the process of creating it.


You're already familiar with the top, or project-level, test plan and know that the process of creating it is more important than the resulting document. The next three levels, the test design specification, the test case specification, and the test procedure specification, are described in detail in the following sections.

As you can see in Figure 18.1, moving further away from the top-level test plan puts less emphasis on the process of creation and more on the resulting written document. The reason is that these plans are used on a daily, sometimes hourly, basis by the testers performing the testing. As you'll learn, at the lowest level they become step-by-step instructions for executing a test, so it's essential that they're clear, concise, and organized; how they got that way isn't nearly as important.

The information presented in this chapter is adapted from the IEEE Std 829-1998 Standard for Software Test Documentation (available from standards.ieee.org). This standard is what many testing teams have adopted as their test planning documentation (intentional or not) because it represents a logical and commonsense method for test planning. The important thing to realize about this standard is that unless you're bound to follow it to the letter because of the type of software you're testing or by your corporate or industry policy, you should use it as a guideline and not a standard. The information it contains and the approaches it recommends are as valid today as they were when the standard was first written in 1983. But what used to work best as a written document is often better and more efficiently presented today as a spreadsheet or a database. You'll see an example of this later in the chapter.

The bottom line is that you and your test team should create test plans that cover the information outlined in IEEE 829. If paper printouts work best (which would be hard to believe), by all means use them. If, however, you think a central database is more efficient and your team has the time and budget to develop or buy one, you should go with that approach. Ultimately it doesn't matter. What does matter is that when you've completed your work, you've met the four goals of test case planning: organization, repeatability, tracking, and proof.
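
If the database route appeals to you, it doesn't have to be elaborate. Here's a minimal, hypothetical sketch using SQLite; the table layout and the sample row are invented for illustration, loosely mirroring the IEEE 829 test case fields discussed later in this chapter:

    import sqlite3

    # A minimal sketch of a test case "database"; the schema is a
    # hypothetical condensation of the IEEE 829 fields, not a standard.
    conn = sqlite3.connect("test_cases.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS test_cases (
            case_id     TEXT PRIMARY KEY,  -- unique identifier
            test_item   TEXT,              -- feature or module under test
            input_spec  TEXT,              -- values/conditions sent to the software
            output_spec TEXT,              -- expected result
            depends_on  TEXT               -- intercase dependencies, if any
        )""")
    conn.execute(
        "INSERT OR REPLACE INTO test_cases VALUES (?, ?, ?, ?, ?)",
        ("15326", "Calculator addition", "check the highest possible value",
         "sum displayed without overflow", None),
    )
    conn.commit()
    conn.close()

A single table like this already gives you the organization and tracking goals; queries against it give you the proof.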

TIP

There are many good templates available on the Web for test plans that are based on IEEE 829. Just do a search for "Test Plan Template" to find them.


Test Design

The overall project test plan is written at a very high level. It breaks out the software into specific features and testable items and assigns them to individual testers, but it doesn't specify exactly how those features will be tested. There may be a general mention of using automation or black-box or white-box testing, but the test plan doesn't get into the details of exactly where and how they will be used. This next level of detail that defines the testing approach for individual software features is the test design specification.

IEEE 829 states that the test design specification "refines the test approach [defined in the test plan] and identifies the features to be covered by the design and its associated tests. It also identifies the test cases and test procedures, if any, required to accomplish the testing and specifies the feature pass/fail criteria."

The purpose of the test design spec is to organize and describe the testing that needs to be performed on a specific feature. It doesn't, however, give the detailed cases or the steps to execute to perform the testing. The following topics, adapted from the IEEE 829 standard, address this purpose and should be part of the test design specs that you create:

  • Identifiers. A unique identifier that can be used to reference and locate the test design spec. The spec should also reference the overall test plan and contain pointers to any other plans or specs that it references.

  • Features to be tested. A description of the software feature covered by the test design spec, for example, "the addition function of Calculator," "font size selection and display in WordPad," and "video card configuration testing of QuickTime."

    This section should also identify features that may be indirectly tested as a side effect of testing the primary feature. For example, "Although not the target of this plan, the UI of the file open dialog box will be indirectly tested in the process of testing the load and save functionality."

    It should also list features that won't be tested, ones that may be misconstrued as being covered by this plan. For example, "Because testing Calculator's addition function will be performed with automation by sending keystrokes to the software, there will be no indirect testing of the onscreen UI. The UI testing is addressed in a separate test design plan, CalcUI12345."

  • Approach. A description of the general approach that will be used to test the features. It should expand on the approach, if any, listed in the test plan, describe the technique to be used, and explain how the results will be verified.

    For example, "A testing tool will be developed to sequentially load and save pre-built data files of various sizes. The number of data files, the sizes, and the data they contain will be determined through black-box techniques and supplemented with white-box examples from the programmer. A pass or fail will be determined by comparing the saved file bit-for-bit against the original using a file compare tool."

  • Test case identification. A high-level description and references to the specific test cases that will be used to check the feature. It should list the selected equivalence partitions and provide references to the test cases and test procedures used to run them. For example,

    Check the highest possible value: Test Case ID# 15326
    Check the lowest possible value: Test Case ID# 15327
    Check several interim powers of 2: Test Case ID# 15328


    It's important that the actual test case values aren't defined in this section. For someone reviewing the test design spec for proper test coverage, a description of the equivalence partitions is much more useful than the specific values themselves. (The second sketch following this list shows how these partitions might later be filled in with actual values at the test case level.)

  • Pass/fail criteria. Describes exactly what constitutes a pass and a fail of the tested feature. What is acceptable and what is not? This may be very simple and clear: a pass is when all the test cases are run without finding a bug. It can also be fuzzy: a failure is when 10 percent or more of the test cases fail. There should be no doubt, though, about what constitutes a pass or a fail of the feature.
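
To make the approach example above concrete, here's a minimal, hypothetical sketch of the bit-for-bit verification it describes. The file names are invented, and the standard library's filecmp module stands in for whatever file compare tool a real project would choose:

    import filecmp

    def round_trip_passes(original_path: str, resaved_path: str) -> bool:
        # shallow=False forces a byte-for-byte comparison instead of
        # comparing only os.stat() metadata.
        return filecmp.cmp(original_path, resaved_path, shallow=False)

    # Hypothetical usage: the tool under test has already loaded
    # "case01.dat" and saved it back out as "case01.out".
    if __name__ == "__main__":
        print("PASS" if round_trip_passes("case01.dat", "case01.out") else "FAIL")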
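
And here's a hedged sketch of what the three partitioned cases above (IDs 15326 through 15328) might look like once actual values are assigned at the test case level. The add function is a stand-in for the Calculator feature under test, and the 2**31 boundaries are assumptions about its limits, not values from the text:

    import pytest

    MAX_VALUE = 2**31 - 1   # assumed upper limit of the feature under test
    MIN_VALUE = -(2**31)    # assumed lower limit

    def add(a, b):
        """Stand-in for the Calculator addition feature under test."""
        return a + b

    @pytest.mark.parametrize("case_id, a, b, expected", [
        ("15326", MAX_VALUE - 1, 1, MAX_VALUE),   # highest possible value
        ("15327", MIN_VALUE + 1, -1, MIN_VALUE),  # lowest possible value
        ("15328", 2**8, 2**8, 2**9),              # interim powers of 2
        ("15328", 2**16, 2**16, 2**17),
    ])
    def test_addition_boundaries(case_id, a, b, expected):
        assert add(a, b) == expected, f"Test Case ID# {case_id} failed"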

YES, A CRASH IS A FAILURE

I was involved in a project that used an outsourced testing company for configuration testing of a multimedia program. They weren't the best choice but were the only ones available at the time. To make sure the job went smoothly, detailed test design specs, test case specs, and test procedures were submitted to the testing company so that there would be no question as to what would and wouldn't be tested.

Several weeks passed and the testing seemed to be going smoothly, too smoothly, when one day the lead tester on the project called. He reported on what his team had found for the week, which wasn't much, and just before hanging up asked if he should be reporting bugs on things that weren't listed in the documentation. When asked why, he said that since the first day of testing his team had occasionally seen big white boxes that said something about a "general protection fault." They would dismiss them, but eventually their PC screens would turn bright blue with another cryptic serious-failure error message and they would be forced to reboot their machines. Since that specific error wasn't listed as one of the fail criteria, he wasn't sure if it was important and thought he should check.

The moral of the story is that you can't assume that another tester will view the product the same way you will. Sometimes you may have to define the obvious in your test cases, even that a crash is unacceptable.


Test Cases

Chapters 4 through 7 described the fundamentals of software testing: dissecting a specification, code, and the software itself to derive the minimum number of test cases that would effectively test the software. What wasn't discussed in those chapters is how to record and document the cases you create. If you've already started doing some software testing, you've likely experimented with different ideas and formats. This section on documenting test cases will give you a few more options to consider.

IEEE 829 states that the test case specification "documents the actual values used for input along with the anticipated outputs. A test case also identifies any constraints on the test procedure resulting from use of that specific test case."

Essentially, the details of a test case should explain exactly what values or conditions will be sent to the software and what result is expected. It can be referenced by one or more test design specs and may reference more than one test procedure. The IEEE 829 standard also lists some other important information that should be included:

  • Identifiers. A unique identifier is referenced by the test design specs and the test procedure specs.

  • Test item. This describes the detailed feature, code module, and so on that's being tested. It should be more specific than the features listed in the test design spec. If the test design spec said "the addition function of Calculator," the test case spec would say "upper limit overflow handling of addition calculations." It should also provide references to the product specifications or other design docs on which the test case is based.

  • Input specification. This specification lists all the inputs or conditions given to the software to execute the test case. If you're testing Calculator, this may be as simple as 1+1. If you're testing cellular telephone switching software, there could be hundreds or thousands of input conditions. If you're testing a file-based product, it would be the name of the file and a description of its contents.

  • Output specification. This describes the result you expect from executing the test case. Did 1+1 equal 2? Were the thousands of output variables set correctly in the cell software? Did all the contents of the file load as expected?

  • Environmental needs. Environmental needs are the hardware, software, test tools, facilities, staff, and so on that are necessary to run the test case.

  • Special procedural requirements. This section describes anything unusual that must be done to perform the test. Testing WordPad probably doesn't need anything special, but testing nuclear power plant software might.

  • Intercase dependencies. Chapter 1, "Software Testing Background," included a description of a bug that caused NASA's Mars Polar Lander to crash on Mars. It's a perfect example of an undocumented intercase dependency. If a test case depends on another test case or might be affected by another, that information should go here.

Are you panicked yet? If you follow this suggested level of documentation to the letter, you could be writing at least a page of descriptive text for each test case you identify! Thousands of test cases could take thousands of pages of documentation. The project could be outdated by the time you finish writing.

This is another reason why you should take the IEEE 829 standard as a guideline and not follow it to the letter, unless you have to. Many government projects and certain industries are required to document their test cases to this level, but in most other instances you can take some shortcuts.

Taking a shortcut doesn't mean dismissing or neglecting important information; it means figuring out a way to condense the information into a more efficient means of communicating it. For example, there's no reason that you're limited to presenting test cases in written paragraph form. Figure 18.2 shows an example of a printer compatibility table.

Figure 18.2. Test cases can be presented in the form of a matrix or table.


Each line of the matrix is a specific test case and has its own identifier. All the other information that goes with a test case (test item, input spec, output spec, environmental needs, special requirements, and dependencies) is most likely common to all these cases and could be written once and attached to the table. Someone reviewing your test cases could quickly read that information and then review the table to check its coverage. (A sketch of generating such a matrix, rather than typing it by hand, appears below.)
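
Here's a minimal, hypothetical sketch of that idea. The printers and settings are invented placeholders, not the actual contents of Figure 18.2; itertools.product simply enumerates every combination, and each row gets its own identifier:

    import csv
    import itertools

    # Hypothetical configuration dimensions; the real matrix in Figure 18.2
    # would list the actual printers and settings under test.
    printers = ["HP LaserJet IIIP", "Canon BJC-7000", "Epson Stylus 900"]
    modes = ["draft", "normal", "best"]
    orientations = ["portrait", "landscape"]

    with open("printer_matrix.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["case_id", "printer", "mode", "orientation"])
        combos = itertools.product(printers, modes, orientations)
        for n, (printer, mode, orientation) in enumerate(combos, start=1):
            # Each row is one test case with its own identifier.
            writer.writerow([f"PRN-{n:03d}", printer, mode, orientation])

The resulting CSV opens directly in a spreadsheet, which fits the earlier point that IEEE 829 content is often better presented as a spreadsheet or database than as prose.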

Other options for presenting test cases are simple lists, outlines, or even graphical diagrams such as state tables or data flow diagrams. Remember, you're trying to communicate your test cases to others and should use whichever method is most effective. Be creative, but stay true to the purpose of documenting your test cases. One more option, a structured record, is sketched below.
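
For instance, here's a minimal sketch of a single test case captured as a structured record. The field names paraphrase the IEEE 829 topics above, and the values are invented for illustration:

    from dataclasses import dataclass, field

    @dataclass
    class TestCaseSpec:
        """One test case, condensed from the IEEE 829 fields above."""
        identifier: str
        test_item: str
        input_spec: str
        output_spec: str
        environmental_needs: str = "standard test lab PC"
        special_procedures: str = "none"
        intercase_dependencies: list = field(default_factory=list)

    case_15326 = TestCaseSpec(
        identifier="15326",
        test_item="upper limit overflow handling of addition calculations",
        input_spec="maximum allowed value plus 1",
        output_spec="overflow reported per the product spec, no crash",
    )
    print(case_15326)

A record like this can be dumped to a spreadsheet row or a database entry, exactly the kind of condensing shortcut described above.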

Test Procedures

After you document the test designs and test cases, what remains are the procedures that need to be followed to execute the test cases. IEEE 829 states that the test procedure specification "identifies all the steps required to operate the system and exercise the specified test cases in order to implement the associated test design."

The test procedure or test script spec defines the step-by-step details of exactly how to perform the test cases. Here's the information that needs to be defined:

  • Identifier. A unique identifier that ties the test procedure to the associated test cases and test design.

  • Purpose. The purpose of the procedure and reference to the test cases that it will execute.

  • Special requirements. Other procedures, special testing skills, or special equipment needed to run the procedure.

  • Procedure steps. Detailed description of how the tests are to be run (a sketch showing these steps mirrored in a script follows this list):

    • Log. Tells how and by what method the results and observations will be recorded.

    • Setup. Explains how to prepare for the test.

    • Start. Explains the steps used to start the test.

    • Procedure. Describes the steps used to run the tests.

    • Measure. Describes how the results are to be determined, for example, with a stopwatch or visual determination.

    • Shut down. Explains the steps for suspending the test for unexpected reasons.

    • Restart. Tells the tester how to pick up the test at a certain point if there's a failure or after shutting down.

    • Stop. Describes the steps for an orderly halt to the test.

    • Wrap up. Explains how to restore the environment to its pre-test condition.

    • Contingencies. Explains what to do if things don't go as planned.
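
Not every procedure needs to be automated, but the IEEE 829 steps map naturally onto a script skeleton. Here's a minimal, hypothetical sketch; the procedure and case identifiers are invented, and the function bodies are placeholders rather than a real application driver:

    import logging

    # Log: results and observations are recorded to a file.
    logging.basicConfig(filename="calc_addition.log", level=logging.INFO)
    log = logging.getLogger("proc-CALC-ADD-001")  # hypothetical procedure ID

    def setup():
        """Setup: prepare for the test (placeholder)."""
        log.info("setup: clean OS confirmed, Calculator installed")

    def run_cases():
        """Procedure: run the associated test cases (placeholders)."""
        for case_id in ("15326", "15327", "15328"):
            log.info("running test case %s", case_id)
            # ... drive the application and record pass/fail here ...

    def wrap_up():
        """Wrap up: restore the environment to its pre-test condition."""
        log.info("wrap up: environment restored")

    if __name__ == "__main__":
        setup()
        try:
            run_cases()
        except Exception:
            # Contingencies: record what happened so the Restart steps
            # can pick up from a known point instead of starting over.
            log.exception("unexpected failure during procedure")
            raise
        finally:
            wrap_up()

Even when the tests themselves are run by hand, a logging skeleton like this addresses the Log, Contingencies, and Wrap up requirements in a repeatable way.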

It's not sufficient for a test procedure to just say, "Try all the following test cases and report back on what you see…." That would be simple and easy but wouldn't tell a new tester anything about how to perform the testing. It wouldn't be repeatable and there'd be no way to prove which steps were executed. Using a detailed procedure makes it clear exactly what will be tested and how. Figure 18.3 shows an excerpt from a fictional example of a test procedure for Windows Calculator.

Figure 18.3. A fictional example of a test procedure shows how much detail can be involved.


Detail Versus Reality

An old saying, "Do everything in moderation," applies perfectly well to test case planning. Remember the four goals: organization, repeatability, tracking, and proof. As a software tester developing test cases, you need to work toward these goals, but the level of detail required is determined by your industry, your company, your project, and your team. It's unlikely that you'll need to document your test cases down to the greatest level of detail and, hopefully, you won't be working on an ad hoc, seat-of-your-pants project where you don't need to document anything at all. Odds are, your work will lie somewhere in between.

The trick is finding the right level of moderation. Consider the test procedure shown in Figure 18.3 that requires Windows 98 to be installed on a PC to run the tests. The procedure states in its setup section that Windows 98 is required, but it doesn't state which specific version of Windows 98. What happens with Windows 98 SE or with the various service pack updates? Does the test procedure need to be updated to reflect the change? To avoid this problem, the version could be omitted and replaced with "latest available," but then what happens if a new release comes out during the product cycle? Should the tester switch OS releases in the middle of the project?

Another issue is that the procedure tells the tester to simply install a "clean copy" of Win98. What does "clean copy" mean? The procedure lists a couple of tools, WipeDisk and Clone, to be used in the setup process and refers the tester to a document that explains how to use them. Should the procedure steps be more detailed and explain exactly where to obtain this other document and these tools? If you've ever installed an operating system, you know it's a complex process that requires the installer to answer many questions and decide on many options. Should this procedure or a related procedure go into that level of detail? If it doesn't, how can it be known what configuration the tests were run on? If it does, and the installation process changes, there could be hundreds of test procedures to update. What a mess. (One way to sidestep part of this problem is sketched below.)
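
One middle-ground tactic, offered here as a hedged sketch rather than something the procedure in Figure 18.3 prescribes, is to have the setup step record the environment it actually ran on, so the test log, not the procedure text, answers the configuration question:

    import json
    import platform

    def capture_environment(out_path: str = "environment.json") -> dict:
        """Record the actual test environment so the log, not the
        procedure text, answers the configuration question."""
        env = {
            "os": platform.system(),
            "os_release": platform.release(),
            "os_version": platform.version(),
            "machine": platform.machine(),
            "python": platform.python_version(),
        }
        with open(out_path, "w") as f:
            json.dump(env, f, indent=2)
        return env

    if __name__ == "__main__":
        print(capture_environment())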

Unfortunately, there is no single right answer. Highly detailed test case specs reduce ambiguity, make tests perfectly repeatable, and allow inexperienced testers to execute tests exactly as they were intended. On the other hand, writing test case specs to this level takes considerably more time and effort, makes updates difficult, and, because of all the detail, can bog down the test effort, causing it to take much longer to run.

When you start writing test cases, your best bet is to adopt the standards of the project you're working on. If you're testing a new medical device, your procedures will most likely need to be much more detailed than if you're testing a video game. If you're involved in setting up or recommending how the test design, test cases, and test procedures will be written for a new project, review the formats defined by the IEEE 829 standard, try some examples, and see what works best for you, your team, and your project.


