Roles and Artifacts

Four main roles are involved in the Test discipline:

  • The Test Manager has the overall responsibility for the test effort's success. The role involves quality and test advocacy, resource planning and management, and resolution of issues that impede the test effort.

  • The Test Analyst is responsible for identifying and defining the required tests, monitoring detailed testing progress and results in each test cycle, and evaluating the overall quality experienced as a result of testing activities. The role typically carries the responsibility for appropriately representing the needs of stakeholders who do not have direct or regular representation on the project.

  • The Test Designer is responsible for defining the test approach and ensuring its successful implementation. The role involves identifying the appropriate techniques, tools, and guidelines to implement the required tests, and giving guidance on the corresponding resource requirements for the test effort.

  • The Tester is responsible for executing the system tests. This effort includes setting up and executing tests, evaluating test execution, recovering from errors, assessing the results of testing, and logging change requests.

When specific code (such as drivers or stubs) must be developed to support testing, the Designer and the Implementer are also involved in roles similar to the ones defined in Chapters 10 and 11.
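To make this concrete, here is a minimal sketch of what such supporting code might look like, assuming a hypothetical OrderService component that depends on an external CreditCheckService. The stub replaces the real credit service with a canned rule, and the driver exercises the component in isolation; the class names and the approval rule are illustrative assumptions, not part of RUP.

    // Hypothetical interface on which the component under test depends.
    interface CreditCheckService {
        boolean isCreditApproved(String customerId, double amount);
    }

    // Component under test: it accepts an order only if credit is approved.
    class OrderService {
        private final CreditCheckService creditCheck;

        OrderService(CreditCheckService creditCheck) {
            this.creditCheck = creditCheck;
        }

        boolean placeOrder(String customerId, double amount) {
            return amount > 0 && creditCheck.isCreditApproved(customerId, amount);
        }
    }

    // Test stub: stands in for the real credit service with a canned rule,
    // so OrderService can be tested without the external system.
    class CreditCheckStub implements CreditCheckService {
        public boolean isCreditApproved(String customerId, double amount) {
            return amount <= 1000.0;
        }
    }

    // Test driver: sets up the stub and exercises the component under test.
    public class OrderServiceTestDriver {
        public static void main(String[] args) {
            OrderService service = new OrderService(new CreditCheckStub());
            System.out.println("small order accepted: " + service.placeOrder("C-42", 250.0));
            System.out.println("large order rejected: " + !service.placeOrder("C-42", 5000.0));
        }
    }

Compiled as OrderServiceTestDriver.java, the driver can be run directly; in practice such code is written alongside the Test Scripts described below.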

Figure 12-1 shows the test artifacts and their relationships. Figure 12-2 shows the roles and artifacts in the Test discipline.

Figure 12-1. Artifacts and their relationships in the Test discipline


Figure 12-2. Roles and artifacts in the Test discipline


These are the key testing artifacts:

  • The Test Plan describes the purpose, goals, and schedule of testing for the project. It identifies the strategies to be used, the resources necessary to implement and execute testing, and the Test Configurations that will be required.

  • The Test Analyst maintains a Test Ideas List, an enumerated list of ideas, often partially formed, that identify potentially useful tests to conduct.

  • Some test ideas will evolve into full-fledged Test Cases, which specify a test, its conditions for execution, and the associated Test Data.

  • Test Scripts are manual or automated procedures used by the Tester to execute the tests.

  • Test Scripts may be assembled into Test Suites; a brief sketch of an automated Test Script and Test Suite appears after this list.

  • A Workload Analysis Model is a special kind of Test Case for performance testing; it identifies the variables used in the different performance tests and defines their values, so as to simulate or emulate actor characteristics, the end users' business functions (use cases), and the associated load and volume. A sketch of such workload variables also appears after this list.

  • The Test Log is the raw data captured during the execution of Test Suites.

  • Test Logs are filtered and analyzed to produce Test Results, from which change requests are raised to document defects or request enhancements (see Chapter 13). Ultimately, a Test Evaluation Summary is produced as part of the project's iteration assessment and periodic status assessment (see Chapter 7, The Project Management Discipline).
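As an illustration of how a test idea can end up as a Test Case automated in a Test Script and grouped into a Test Suite, the sketch below uses JUnit 4. RUP does not prescribe a particular tool, and the Account class, the test names, and the suite are hypothetical.

    // File: Account.java -- hypothetical class under test.
    public class Account {
        private double balance;
        public Account(double openingBalance) { this.balance = openingBalance; }
        public void deposit(double amount)    { balance += amount; }
        public void withdraw(double amount)   { balance -= amount; }
        public double getBalance()            { return balance; }
    }

    // File: WithdrawalTest.java -- the test idea "a withdrawal reduces the balance"
    // made concrete as a Test Case and automated as a Test Script.
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class WithdrawalTest {
        @Test
        public void withdrawalReducesBalance() {
            Account account = new Account(100.0);   // test data: opening balance
            account.withdraw(30.0);                 // action exercised by the test
            assertEquals(70.0, account.getBalance(), 0.001);
        }
    }

    // File: DepositTest.java -- a second Test Script.
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class DepositTest {
        @Test
        public void depositIncreasesBalance() {
            Account account = new Account(100.0);
            account.deposit(25.0);
            assertEquals(125.0, account.getBalance(), 0.001);
        }
    }

    // File: AccountTestSuite.java -- Test Scripts assembled into a Test Suite.
    import org.junit.runner.RunWith;
    import org.junit.runners.Suite;

    @RunWith(Suite.class)
    @Suite.SuiteClasses({ WithdrawalTest.class, DepositTest.class })
    public class AccountTestSuite { }

Running the suite (for example with java org.junit.runner.JUnitCore AccountTestSuite) executes both Test Scripts and produces the raw output that would feed a Test Log.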
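Similarly, a minimal sketch of the kind of variables a Workload Analysis Model might define is shown below; every name and number in it is an illustrative assumption.

    // File: WorkloadModel.java -- illustrative workload variables and values
    // for a performance Test Case; the names and numbers are assumptions only.
    public class WorkloadModel {
        public static final int    CONCURRENT_USERS   = 200;    // simulated actors
        public static final double THINK_TIME_SECONDS = 8.0;    // pause between user actions
        public static final int    ORDERS_PER_HOUR    = 1500;   // load on the "place order" use case
        public static final int    QUERIES_PER_HOUR   = 6000;   // load on the "browse catalog" use case
        public static final int    CATALOG_RECORDS    = 50000;  // data volume in the test database

        public static void main(String[] args) {
            System.out.printf("Emulating %d users with %.0f s think time%n",
                              CONCURRENT_USERS, THINK_TIME_SECONDS);
        }
    }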


