"Planning is a process that should build upon itself – each step should create a new understanding of the situation which becomes the point of departure for new plans."
— Planning, MCDP 5
U.S. Marine Corps
Below is a sample master test plan that was created to test the STQE.net Web site, which later became known as StickyMinds.com.
STQE.net Master Test Plan, Release 1
Version 1.5
The following documents have been used in the preparation of this document:
Software Quality Engineering has contracted with an outside software development vendor to create a World Wide Web (WWW) site to function as a knowledge and information sharing site for software testing and quality engineering professionals. The target audience will be the same as that of Software Testing and Quality Engineering magazine: software managers (development, testing, and quality), test professionals, and software engineers who are interested in building and delivering better software.
Unlike many WWW sites, this site, to be known as STQE.net, is a software-driven database application built with Microsoft Site Builder, ASP code, and an MS-SQL database. This Master Test Plan (MTP) covers the testing activities for the software and does not cover the initial or ongoing tasks of adding, editing, publishing, and verifying content.
The STQE.net site will be introduced in releases, with each release adding functionality:
Future enhancements, such as job postings, banner ad management, a "What's new" feature, and a comprehensive site search engine, will take place in subsequent releases. This master test plan, covering the testing for Release 1, includes the following testing levels:
The testing philosophy is risk-based: all test objectives and tests will be prioritized for each level of testing as critical, high, medium, or low priority.
The software items to be tested include the following:
As this is the initial release of STQE.net, testing will be required to verify all requirements of the site. Software risk issues are identified and prioritized in the STQE.net Test Objectives spreadsheet included in Appendix A of this plan.
Test objectives are listed as requirements-based, design-based, or code-based and further separated into groups:
Requirements Based

| Prefix | Test Objective Group |
|---|---|
| RB-FN | Features - Navigation Bar |
| RB-FH | Features - Home |
| RB-UM | Features - User Member Management (Join, Sign-In, Update Profile) |
| RB-FI | Features - Interest Areas |
| RB-FB | Features - Books |
| RB-FT | Features - Tools and Services |
| RB-FC | Features - Calendar of Events |
| RB-FD | Features - Disclosures and Terms |
| RB-FS | Features - Sponsors and Advertisers |
| RB-FA | Features - Administrators |
| RB-SG | Scenarios - Guests |
| RB-SU | Scenarios - User Members (Logged-In) |
| RB-SM | Scenarios - Moderators |
| RB-SP | Scenarios - Providers (Vendors) |
| RB-SA | Scenarios - Administrator |

Design Based

| Prefix | Test Objective Group |
|---|---|
| DB-US | Usage |
| DB-SC | Security |
| DB-ML | Multi-Language |
| DB-PF | Performance (Volume and Stress) |
| DB-BC | Browser Configurations |
| DB-SR | Site Failure/Restart |
| DB-BK | Backup/Recovery |

Code Based

| Prefix | Test Objective Group |
|---|---|
| CB-LK | Links |
| CB-HS | Syntax (HTML and ASP Code) |
| CB-TG | Metatags and Graphics Tags |
We expect to test all of the objectives in the Test Objectives Inventory (Appendix A). However, if time does not permit, some of the low-priority items may be dropped.
The testing will be done manually until the site is sufficiently stable to begin developing automated tests. The testing will cover the requirements for all of the roles participating in the site: guests, members, vendors, moderators, and administrators.
Automated Testing Tools
We will implement automated testing using commercially available, off-the-shelf tools. One tool will be used for feedback and defect tracking; another will be used for test scripting, combining manual and automated tests. Capture/playback tools will be used on a limited basis to automate parts of the smoke test. Other utilities, such as link testers and HTML syntax checkers, will be used as needed. We do not plan any automated performance testing for Release 1.
The smoke tests will be the first series of tests to be automated. Work will begin when the GUI and database are stable.
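For illustration only (not part of the original plan), a scripted smoke check could simply request a few key pages and verify that each loads and shows expected text; this is roughly the behavior a capture/playback script would reproduce. The URLs and expected strings below are hypothetical placeholders, and the use of Python with the requests library is an assumption.

```python
# Minimal smoke-test sketch (illustrative only; URLs and expected text are placeholders).
# Assumes the Python "requests" package is installed.
import requests

# Hypothetical key pages and a string each should contain when the site is working.
SMOKE_CHECKS = [
    ("http://test.stqe.net/", "Welcome"),
    ("http://test.stqe.net/books.asp", "Books"),
    ("http://test.stqe.net/calendar.asp", "Calendar of Events"),
]

def run_smoke_test(checks=SMOKE_CHECKS, timeout=10):
    """Return True only if every page loads (HTTP 200) and shows its expected text."""
    passed = True
    for url, expected_text in checks:
        try:
            response = requests.get(url, timeout=timeout)
        except requests.RequestException as exc:
            print(f"FAIL {url}: {exc}")
            passed = False
            continue
        if response.status_code != 200 or expected_text not in response.text:
            print(f"FAIL {url}: status={response.status_code}")
            passed = False
        else:
            print(f"PASS {url}")
    return passed

if __name__ == "__main__":
    # Per the plan, further testing is halted if the smoke test does not pass.
    raise SystemExit(0 if run_smoke_test() else 1)
```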
Defect Tracking
Testing issues and feedback from beta users will be reported on the STQE.net Issue Form and entered into a tool. Within one business day, we will analyze each new issue and classify it as a software defect, enhancement, could not reproduce, not a problem, or failure. The severity level and fix priority of software defects will then be set. Issue classes, severity categories, and fix priorities are listed in Appendix B.
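For illustration only, the issue classifications, severity levels, and fix priorities listed in Appendix B could be captured in a simple issue record such as the sketch below. The field names and structure are assumptions made for illustration; they are not part of the plan or of any particular tracking tool.

```python
# Illustrative sketch of an STQE.net issue record using the Appendix B categories.
# Field names and structure are assumptions, not part of the plan.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Classification(Enum):
    SOFTWARE_DEFECT = "Software Defect"
    ENHANCEMENT = "Enhancement"
    COULD_NOT_REPRODUCE = "Could Not Reproduce"
    NOT_A_PROBLEM = "Not a Problem"
    FAILURE_ENVIRONMENT = "Failure - Environment"
    FAILURE_TESTWARE_DEFECT = "Failure - Testware Defect"
    FAILURE_TEST_EXECUTION = "Failure - Test Execution"
    FAILURE_OTHER = "Failure - Other"

class Severity(Enum):
    LOW = "Low"
    MEDIUM = "Medium"
    HIGH = "High"
    CRITICAL = "Critical"

class FixPriority(Enum):
    LOW = "Low"
    MEDIUM = "Medium"
    HIGH = "High"
    CRITICAL = "Critical"

@dataclass
class Issue:
    """One entry from the STQE.net Issue Form, classified within one business day."""
    issue_id: int
    description: str
    classification: Classification
    # Severity and fix priority apply only to issues classified as software defects.
    severity: Optional[Severity] = None
    fix_priority: Optional[FixPriority] = None

# Example: a beta-user report classified as a high-severity software defect.
example = Issue(
    issue_id=1,
    description="Sign-in page rejects valid member passwords",
    classification=Classification.SOFTWARE_DEFECT,
    severity=Severity.HIGH,
    fix_priority=FixPriority.HIGH,
)
```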
Change Management
When the vendor turns the software over to SQE for testing, all changes to the site software will come under change control. The project manager will approve all changes moved into the test environment. A change notice must define the modules being changed and the reason for the change, including all repaired defects. Except for critical fixes that are blocking current testing efforts, changes will be scheduled not to impact testing.
Except for emergency repairs, changes will not be moved into the live environment until the test manager approves a new version for production. After the software is moved to the live environment, testing will confirm that it matches the configuration under test, and a smoke test will be performed.
Test Cycles
Each time a new version is released to the test environment, the following process will be undertaken:
While Release 1 is in the "live beta" status, updates that "pass" a test cycle will be moved to the production host and made "live."
Metrics
Metrics will be kept for test effort, incidents, defects, and test cases executed for each test cycle.
The entrance criteria for each level of testing are defined in Appendix C. The exit criteria for each level are the entrance criteria for the following level. The Web site will not be opened to content providers while any critical defects exist in the functions involved in adding content.
Release 1 of the site will not be opened to the general public until all critical and high-severity defects have been resolved. At the project manager's discretion, some critical and high-severity defects may be deferred if their failures do not affect guests' and members' use of the site.
With each update from the vendor, a smoke test will be performed. If the smoke test does not pass, further testing is halted until an update that passes the smoke test has been delivered.
The following documents will be prepared:
The vendor will perform the unit and integration testing. The browsers and operating systems are accepted as is.
Testers will identify the browser used during all tests. Four Web sites will be used in this development process:
Note: The development site uses an Access database for the SQL tables for faster development. All other sites use MS-SQL databases.
A separate test site may be needed for automated testing.
The following roles are identified:
The test manager and test engineers should be familiar with the STEP methodology from having taken the SST course.
| Role | Candidate | Timing |
|---|---|---|
| Project Manager | Jennifer Brock | All, Part-Time |
| Test Manager | John Lisle | All, Part-Time |
| Test Engineers | Jennifer Brock, John Lisle, Paul Danon | All, Part-Time |
| PC / Network Support | Jim Sowder | All, Part-Time |
See Appendix D for the schedule to develop the test planning and design documents. The following table represents the plan for the expected test cycles.
| Testing Cycle | Event | Who | Milestone |
|---|---|---|---|
| Test Cycle 1 | Start | | 3/8/1999 |
| | Run Smoke Test | JB, JD, WM | 3/8/1999 |
| | Complete System Test (except performance) | JB, JD, WM | 3/12/1999 |
| | Complete Acceptance | JB, JD, WM | 3/12/1999 |
| Turnover | Content Providers | WM | 3/15/1999 |
| Test Cycle 2 | Start | | 3/22/1999 |
| | Run Smoke Test | JB, JD, WM | 3/22/1999 |
| | Complete Acceptance | JB, JD, WM | 3/26/1999 |
| Test Cycle 3 | Start | | 4/5/1999 |
| | Run Smoke Test | JB, JD, WM | 4/5/1999 |
| | Complete Acceptance | JB, JD, WM | 4/9/1999 |
| Test Cycle 4 | Start | | 4/12/1999 |
| | Run Smoke Test | JB, JD, WM | 4/12/1999 |
| | Complete Acceptance | JB, JD, WM | 4/16/1999 |
| Test Cycle 5 | Start | | 4/19/1999 |
| | Run Smoke Test | JB, JD, WM | 4/19/1999 |
| | Complete System Test | JB, JD, WM | 4/23/1999 |
| | Complete Acceptance | JB, JD, WM | 4/23/1999 |
| Turnover | General Public | WM | 5/3/1999 |
This plan must be approved by the project manager for the Web site and the SQE project sponsor.
Refer to the electronic spreadsheet for the Test Objectives Inventory.
| Incident Classification | Definition |
|---|---|
| Software Defect | Clearly a defect in the software; it may be requirements based, code based, or design based. |
| Enhancement | An enhancement to the existing application. It could be related to code, data, or process. |
| Could Not Reproduce | The situation could not be recreated; several attempts were made before categorizing it as such. |
| Not a Problem | The situation could be reproduced, and the application, process, and data were determined to be intentionally designed to behave as they do. |
| Failure – Environment | A failure occurred and was determined to be due to a problem with the environment. The same failure does not occur once the environment has been corrected. |
| Failure – Testware Defect | A failure occurred and was determined to be caused by incorrect testware. The testware needs to be corrected. |
| Failure – Test Execution | A failure occurred and was determined to be caused by improper execution of the test. |
| Failure – Other | A failure occurred that does not fit into the above categories. |
| Severity | Definition |
|---|---|
| Low | Minor flaw not affecting operation or understanding of feature. |
| Medium | Feature is usable, but some functionality is lost or user may misinterpret and use improperly. |
| High | Important functionality is lost or feature is not usable, but there is a work-around or feature is not critical to the operations. |
| Critical | Important feature is not usable. Emergency fix is authorized. |
| Fix Priority | Response |
|---|---|
| Low | Correct in the next scheduled enhancement release, or update the documentation and do not fix. |
| Medium | Fix after high-priority defects and enhancements. Document the work-around or effect on users. |
| High | Fix within 72 working hours; stop work on enhancements, if necessary. |
| Critical | Fix ASAP, within 12 hours; overtime authorized; skip full acceptance testing, if necessary. Don't go home until it's fixed. |
| Test Level | Description |
|---|---|
| Unit Test | Component/Module for unit test is 100% complete: |
| Functional Test | Components/Modules for integration test are 100% complete: |
| System Test | Components/Modules for System test are 100% complete: |
| Performance Test | Components/Modules for Performance test are 100% complete: |
| Acceptance Test | Components/Modules for Acceptance test are 100% complete: |
| Live Beta | Components/Modules for Live Beta are 100% complete: |
| Deliverable | Event | Who | Milestone |
|---|---|---|---|
| Master Test Plan | First Draft | JBL | 2/8/1999 |
| | Review | JBL, JB, WM | 2/9/1999 |
| | Version 1.0 | JBL | 2/12/1999 |
| | Review | JBL, JB, WM, DG | 2/24/1999 |
| | Version 1.1 | JBL | 3/1/1999 |
| Test Objectives | First Draft (Partial) | JBL | 2/8/1999 |
| | Version 0.9 (Partial) | JBL | 2/12/1999 |
| | Review | JBL, JB, WM, DG | 2/24/1999 |
| | Version 1.0 | JBL | 3/1/1999 |
| | Review | JBL, JB, WM | 3/3/1999 |
| | Version 1.1 | JBL | 3/8/1999 |
| Web Control Structure | First Draft (Partial) | JBL | 2/11/1999 |
| | Review | JBL, JB, WM, DG | 2/15/1999 |
| | Version 1.0 | JBL | 3/1/1999 |
| | Review | JBL, JB, WM | 3/3/1999 |
| | Version 1.1 | JBL | 3/9/1999 |
| Smoke Test Design | First Draft | JBL | 3/8/1999 |
| | Review | JBL, JB, WM | 3/12/1999 |
| | Version 1.0 | JBL | 3/15/1999 |
| | Review | JBL, JB, WM | 3/18/1999 |
| | Version 1.1 | JBL | 3/24/1999 |
| System/Acceptance Test Design | First Draft | JBL | 3/12/1999 |
| | Review | JBL, JB, WM | 3/15/1999 |
| | Version 1.0 | JBL | 3/24/1999 |
| | Review | JBL, JB, WM | 3/27/1999 |
| | Version 1.1 | JBL | 3/31/1999 |