Formal Test Plan


With the possible exceptions of unit testing and performance testing, ETL testing sessions are organized events that are guided and controlled by an agenda called a test plan. Each test plan should specify the information illustrated in Figure 11.6.

Figure 11.6. Test Plan Components


  • Purpose: Describe in general terms what is being tested. For example:

    "Data is extracted from the Customer Master file, Account Tran file, Sales History database, and Product Master database. The extracted data from Customer Master, Account Tran, and Sales History must be merged under Customer Number, and the extracted data from Product Master and Sales History must be merged under Product Number. The ETL programming specifications include 28 transformations and 46 cleansing algorithms. We will run 20 test cases to test the transformations and 42 test cases to test the cleansing algorithms."

  • Schedule: Review the ETL process flow diagram to determine which programs must run in sequence and which can run in parallel, then schedule every program in the job stream to run in exactly that sequence on specific dates at specific times (see the scheduling sketch after this list).

  • Test cases: The bulk of the test plan is the list of test cases, and the business representative should participate in writing them. Each test case specifies the input criteria and the expected output results for each run. It also describes the program logic to be performed and how the resulting data should look (a minimal sketch of this kind of check appears just before Table 11.1). For example:

    "Submit module ETL3.3 using the T11Customer temporary VSAM file and the T11Product temporary VSAM file. Both temporary files are sorted in descending order. Module ETL3.3 merges the two files and rejects records that do not match on Sale-Tran-Cd and Cust-Num. All rejected records must trigger the error message: 'ETL3.3.e7 No match found on <print out Cust-Num> and <print out Prd-Nbr> <print out system date-time stamp>.'"

  • Test log: A detailed audit trail must be kept of all test runs, itemizing the date and time the programs ran, the program or program module numbers, who validated them, the expected test results, the actual test results, whether the test was accepted or not, and any additional comments. Table 11.1 shows a sample test log, and a sketch of an automated log appender follows the table.
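
The run sequence described under Schedule can be derived mechanically from the process flow diagram by recording each program's predecessors and walking the dependency graph. The sketch below is only a generic illustration, not part of the ETL job stream described in this chapter; the program names and dependencies are invented.

    # Hypothetical job stream: each program lists the programs that must finish first.
    job_stream = {
        "ETL1.1": [],                     # extract; no predecessors
        "ETL1.2": [],                     # extract; can run in parallel with ETL1.1
        "ETL2.1": ["ETL1.1"],             # transform customer data
        "ETL2.2": ["ETL1.2"],             # transform product data
        "ETL3.3": ["ETL2.1", "ETL2.2"],   # merge step waits for both transforms
    }

    def run_waves(jobs):
        """Group jobs into waves; everything within a wave can run in parallel."""
        done, waves = set(), []
        while len(done) < len(jobs):
            wave = [j for j, deps in jobs.items()
                    if j not in done and all(d in done for d in deps)]
            if not wave:
                raise ValueError("Circular dependency in the job stream")
            waves.append(wave)
            done.update(wave)
        return waves

    print(run_waves(job_stream))
    # [['ETL1.1', 'ETL1.2'], ['ETL2.1', 'ETL2.2'], ['ETL3.3']]

Each wave then gets its own date and time slot in the schedule, which preserves the required sequence while still exploiting the parallelism the flow diagram allows.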


Remember that all programs in the ETL process are tested and retested until the complete ETL process runs from beginning to end as expected.
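
The check described in the ETL3.3 test case can also be expressed as a small executable routine that a tester runs against the temporary files. The sketch below only illustrates the merge-and-reject logic under assumed conditions: the record layout and the merge_and_reject helper are inventions, and real VSAM files would be read through the mainframe utilities rather than loaded as Python dictionaries.

    from datetime import datetime

    def merge_and_reject(customer_records, product_records):
        """Merge records that match on Sale-Tran-Cd and Cust-Num; reject the rest."""
        # Assumed record layout: each record is a dict keyed by field name.
        product_index = {(r["Sale-Tran-Cd"], r["Cust-Num"]): r for r in product_records}
        merged, rejected = [], []
        for rec in customer_records:
            match = product_index.get((rec["Sale-Tran-Cd"], rec["Cust-Num"]))
            if match is None:
                # Error message format taken from the test case specification.
                print(f"ETL3.3.e7 No match found on {rec['Cust-Num']} "
                      f"and {rec.get('Prd-Nbr', '?')} {datetime.now():%Y-%m-%d %H:%M:%S}")
                rejected.append(rec)
            else:
                merged.append({**rec, **match})
        return merged, rejected

The expected result for the test run is then easy to state: every rejected record triggers exactly one ETL3.3.e7 message, and no record that matches on both keys appears in the reject list.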

Table 11.1. Example of a Test Log

Date      | Time      | Program Number | Tester | Expected Test Results | Actual Test Results | Accepted (Yes/No) | Comments
8/25/2003 | 8:45 A.M. | ETL1.1v1       | BLB    | 232,489 sold loans; 983,731 purchased | 230,876 sold loans; 983,689 purchased | Yes | There were 1,655 rejected loans. Rejections are valid and expected.
8/25/2003 | 8:45 A.M. | ETL3.1v1       | JAL    | $30,555,791.32 daily loan balance | $33,498,231.11 daily loan balance | No | Double-counting occurs on ARM IV loans funded after 10/01/2003.
8/25/2003 | 9:10 A.M. | ETL1.2v1       | BLB    | SRP-Code totals: A 398,220; B 121,209; C 228,734 | SRP-Code totals: A 398,208; B 120,871; C 228,118 | No | 966 codes have invalid characters like @ : ^ -.
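
A test log like Table 11.1 is often kept in a spreadsheet, but it can also be appended automatically at the end of every test run so that no entry is forgotten. The following sketch is one possible way to do this; the file name, field names, and log_test_run helper mirror the table's columns but are otherwise assumptions, not part of the book's toolset.

    import csv
    from datetime import datetime

    LOG_FIELDS = ["Date", "Time", "Program Number", "Tester",
                  "Expected Test Results", "Actual Test Results",
                  "Accepted (Yes/No)", "Comments"]

    def log_test_run(path, program, tester, expected, actual, accepted, comments=""):
        """Append one audit-trail row to the test log."""
        now = datetime.now()
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
            if f.tell() == 0:              # new log file: write the header row first
                writer.writeheader()
            writer.writerow({
                "Date": now.strftime("%m/%d/%Y"),
                "Time": now.strftime("%I:%M %p"),
                "Program Number": program,
                "Tester": tester,
                "Expected Test Results": expected,
                "Actual Test Results": actual,
                "Accepted (Yes/No)": "Yes" if accepted else "No",
                "Comments": comments,
            })

    # Example entry mirroring the first row of Table 11.1:
    log_test_run("test_log.csv", "ETL1.1v1", "BLB",
                 "232,489 sold loans; 983,731 purchased",
                 "230,876 sold loans; 983,689 purchased",
                 accepted=True,
                 comments="There were 1,655 rejected loans. Rejections are valid and expected.")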


