Activities of the Tester

Now that we understand the major artifacts involved in testing, let us review the principal activities that create, update, or use these artifacts. There are six main groups of testing activities in the RUP (see Figure 18.1):

  • Define test mission

  • Verify test approach

  • Validate build stability (smoke test)

  • Test and evaluate

  • Achieve acceptable mission

  • Improve test assets

Figure 18.1. The Overall Test Workflow. The high-level test activities captured in this figure are done at least once for each iteration. Also within the iteration, testing is done in an iterative fashion (as indicated by the recursive arrows).


These activities are done at least once per iteration. Their focus, however, will vary across the lifecycle.

  • In Inception, the execution of this workflow may be very sketchy. The mission for a new project that produces no build could be simply to "warm up," try some test ideas, put together the tools and test harnesses, and define with other developers some testability requirements. Some testing may be done just by a walkthrough of the design.

  • In Elaboration, you will primarily focus on testing the architecture and key architecturally significant requirements, many of them nonfunctional (performance, scalability, integration with other products), assessing whether the major architectural risks have been mitigated. You will also test the testing strategy and the test architecture (the tools you have put in place to support testing).

  • In Construction, the focus of the test mission will shift toward more functional testing, validating all functionality described in the use cases, leading to a meaningful first beta system. Nonfunctional testing, and in particular performance testing, will continue, both to monitor progress in performance tuning and to detect quickly when an evolution of the system results in a loss of performance.

  • In Transition, your final rush to product release, the focus will be on overall quality, robustness, and usability and on achieving the expected level of quality to allow the release of the product.

In other words, the testing mission is perfectly aligned with the objectives of the phases and does not consist, as in traditional processes, of a long period of leisurely test planning and test development, followed by a frenetic and hurried test period.

We will now review the activities in more detail.

Define Test Mission

You will identify the appropriate focus of the test effort for the upcoming iteration and will gain agreement with stakeholders on the goals that will direct the test effort.

For each iteration, this work is focused mainly on

  • Identifying the objectives for the testing effort and deliverable artifacts.

  • Identifying a good resource utilization strategy.

  • Defining the appropriate scope and boundaries for the test effort.

  • Outlining the approach that will be used, including the tool automation.

  • Defining how progress will be monitored and assessed.

As we have seen, the mission may evolve throughout the lifecycle.

Verify Test Approach

You will demonstrate that the various test techniques and test tools outlined in the Test Approach will facilitate the required testing. You verify by demonstration that the approach will work, produces accurate results, and is appropriate for the available resources. The objective is to gain an understanding of the constraints and limitations of each tool and technique in the context of your specific project, and either to find an appropriate implementation solution for each technique or to find alternative techniques that can be implemented. This helps to mitigate the risk of discovering too late in the project lifecycle that the test approach is unworkable.

For each iteration, this work is focused mainly on

  • Verifying early that the intended Test Approach will work and that it produces results of value.

  • Establishing the basic infrastructure to enable and support the Test Approach.

  • Obtaining commitment from the development team to provide and support the required testability to achieve the Test Approach.

  • Identifying the scope, boundaries, limitations, and constraints of each tool and technique.

Validate Build Stability (Smoke Test)

You will first validate that the build is stable enough for detailed test and evaluation efforts to begin. This work is also referred to as a "smoke test," build verification test, build regression test, sanity check, or acceptance into testing. This work helps prevent test resources from being wasted on a fruitless testing effort.

For each build to be tested, this work is focused on

  • Making an assessment of the stability and testability of the build: Can you install it, load it, and start it?

  • Gaining an initial understanding (or confirming the expectation) of the development work delivered in the build: What was actually integrated into this build?

  • Making a decision, guided by the Evaluation Mission, to accept the build as suitable for use in further testing, or to conduct further testing against a previous build. Again, not all builds are suitable for a test cycle, and there is no point wasting too much testing time and effort on an unsatisfactory build.
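The acceptance decision above can be sketched as a short script. This is a minimal illustration, not a real build-verification tool: the three check functions (`check_install`, `check_load`, `check_start`) and the build dictionary are hypothetical stand-ins for whatever installation, loading, and startup probes a project would actually run.

```python
# Minimal smoke-test sketch: each check answers one acceptance question
# about the build, and the build enters the test cycle only if all pass.

def check_install(build):
    """Can you install it? (Stubbed for illustration.)"""
    return build.get("installable", False)

def check_load(build):
    """Can you load it? (Stubbed for illustration.)"""
    return build.get("loadable", False)

def check_start(build):
    """Can you start it? (Stubbed for illustration.)"""
    return build.get("starts", False)

SMOKE_CHECKS = [check_install, check_load, check_start]

def accept_build(build):
    """Return (accepted, failures). A single failing check rejects the
    build, so no further test effort is wasted on an unstable build."""
    failures = [c.__name__ for c in SMOKE_CHECKS if not c(build)]
    return (not failures, failures)

# A build that installs and loads but fails to start is rejected.
ok, failed = accept_build({"installable": True, "loadable": True, "starts": False})
```

The point of the pattern is the cheap gate: the checks are deliberately shallow, because their job is only to decide whether deeper testing is worth starting at all.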

Test and Evaluate

This is testing per se. You must achieve appropriate breadth and depth of the test effort to enable a sufficient test evaluation, relative of course to the iteration's Evaluation Mission. Typically performed once per test cycle, after accepting a build, this work involves performing the core tactical work of the test and evaluation effort: the implementation, execution, and evaluation of specific tests, and the corresponding reporting of any incidents encountered. Testing tools will help select the appropriate tests and, when they can be run automatically, execute them.

For each test cycle, this work focuses mainly on

  • Providing ongoing evaluation and assessment of the Target Test Items.

  • Recording the appropriate information necessary to diagnose and resolve any identified issues.

  • Achieving suitable breadth and depth in the test and evaluation work.

  • Providing feedback on the most likely areas of potential quality risk.

Achieve an Acceptable Mission

At the same time, you must deliver a useful evaluation result from the test effort to your stakeholders, where usefulness is assessed in terms of the Evaluation Mission. In most cases that will mean focusing your efforts on helping the project team achieve the Iteration Plan objectives that apply to the current test cycle.

For each test cycle, this work focuses mainly on

  • Actively prioritizing the minimal set of necessary tests that must be conducted to achieve the Evaluation Mission.

  • Advocating the resolution of important issues that have a significant negative impact on the Evaluation Mission.

  • Advocating appropriate product quality.

  • Identifying regressions in quality introduced between test cycles.

  • Where appropriate, revising the Evaluation Mission in light of the evaluation findings so as to provide useful evaluation information to the project team.
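Identifying regressions between test cycles, as listed above, amounts to comparing the result of each test in the current cycle against the previous one. A minimal sketch, assuming test results are recorded as simple name-to-status mappings (a hypothetical representation, not something the RUP prescribes):

```python
def find_regressions(previous, current):
    """Tests that passed in the previous cycle but fail in the current
    one are regressions in quality introduced between test cycles."""
    return sorted(t for t, r in current.items()
                  if r == "fail" and previous.get(t) == "pass")

# "search" passed last cycle and fails now: a regression.
# "export" fails in both cycles: a known issue, not a regression.
prev = {"login": "pass", "search": "pass", "export": "fail"}
curr = {"login": "pass", "search": "fail", "export": "fail"}
regressions = find_regressions(prev, curr)  # → ["search"]
```

Separating new regressions from already-known failures keeps the advocacy focused on issues that genuinely moved the product backward.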

Improve Test Assets

As always with any good process, at the end of a test cycle or at least at the end of the iteration, you must close the loop, provide some feedback on the process itself, and take advantage of the iterative nature of the lifecycle to maintain and improve your test assets. This is important especially if the intention is to reuse the assets developed in the current test cycle in subsequent test cycles or even in another project.

For each test cycle, this work focuses mainly on

  • Adding the minimal set of additional tests to validate the stability of subsequent builds.

  • Assembling Test Scripts into additional appropriate Test Suites.

  • Removing test assets that no longer serve a useful purpose or have become uneconomic to maintain.

  • Maintaining Test Environment Configurations and Test Data sets.

  • Exploring opportunities for reuse and productivity improvements.

  • Conducting general maintenance of and making improvements to the maintainability of test automation assets.

  • Documenting lessons learned, both good and bad practices discovered during the test cycle. This should be done at least at the end of the iteration.
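Assembling Test Scripts into Test Suites, mentioned above, is directly supported by most test frameworks. As one concrete sketch, Python's standard `unittest` module lets existing test-case classes be gathered into a reusable suite; the two test classes here are placeholders standing in for real test scripts.

```python
import unittest

class LoginTests(unittest.TestCase):
    """Placeholder test script; a real one would exercise login."""
    def test_valid_credentials(self):
        self.assertTrue(True)

class SearchTests(unittest.TestCase):
    """Placeholder test script; a real one would exercise search."""
    def test_basic_query(self):
        self.assertTrue(True)

def build_regression_suite():
    """Assemble existing test scripts into a suite that can be rerun
    to validate the stability of subsequent builds."""
    loader = unittest.TestLoader()
    suite = unittest.TestSuite()
    suite.addTests(loader.loadTestsFromTestCase(LoginTests))
    suite.addTests(loader.loadTestsFromTestCase(SearchTests))
    return suite

runner = unittest.TextTestRunner(verbosity=0)
result = runner.run(build_regression_suite())
```

Because the suite is built by a function rather than by hand-listing tests, adding a new test script to a subsequent cycle is a one-line change, which keeps the maintenance cost of the asset low.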

Other Related Activities

Do not restrict your investigation of the RUP to the test discipline, though. Testers and quality engineers are likely to be involved in many other activities in the RUP. For example:

  • Defining the Quality Assurance Plan (part of the SDP).

  • Participating in various reviews of the requirements and of the design, from which they will glean test ideas.

  • Participating in the activities of the Change Control Board to review defects and decide their fate.



The Rational Unified Process Made Easy: A Practitioner's Guide to the Rational Unified Process