The Preparation Phase

The first phase of a quality assessment is the preparation phase. One must first understand the development process and the key activities and milestones of the project schedule, and then identify where the project stands relative to both. Medium and large software projects normally have many components with varying schedules under an overall system schedule; the majority of the components, however, should be in the same phase at any given time.

15.1.1 What Data Should I Look At?

For projects with an iterative development process, the macro level of product development would contain phases such as analysis, design, code, test, and customer validation, although at the micro level selected components may still be going through iterative cycles near the back end of the project schedule.

For each macro phase of the development process and project schedule, there is a set of data, both quantitative and qualitative, that gauges development progress, helps to surface problems, and can provide a predictive indication of final product quality. Previous chapters contain many examples of phase-specific metrics and data. In general, fewer data and metrics are available in the early phases of the development cycle, and those very early indicators are also less representative of final product quality than those at the back end of the development cycle. For example, the frequency of system crashes and hangs during the system test phase is a better indicator of how the product will perform in the field than the number of defects found during unit testing. This does not mean that quality assessments in the early cycles of the project are less important. One needs to make sure that the project is on track at every major phase in order to achieve the desired final outcome. For example, positive indicators from the requirements and design phases mean that the back end of the development process will be more stable and predictable.
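As a small illustration of a late-phase indicator like the one just mentioned, crash frequency during system test can be summarized from a simple log. All names and numbers below are invented for the sketch, not taken from the book:

```python
# Illustrative sketch (names and numbers invented): summarizing the
# frequency of system crashes/hangs during the system test phase,
# a late-cycle indicator of likely field quality.

def crashes_per_test_week(crash_log):
    """Average number of crash/hang events per week of system test."""
    weeks = {entry["week"] for entry in crash_log}
    return len(crash_log) / len(weeks) if weeks else 0.0

crash_log = [
    {"week": 1, "severity": "crash"},
    {"week": 1, "severity": "hang"},
    {"week": 2, "severity": "crash"},
]
print(crashes_per_test_week(crash_log))  # 3 events over 2 weeks -> 1.5
```

Tracked week over week, a declining rate suggests stabilization, while a flat or rising rate late in system test is a warning sign for the assessment.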

Suppose we are conducting a quality assessment for Project Checkpoint Review 1 (PR1) in the project depicted by Figure 15.1. The data to be gathered and assessed would be related to requirements and design, such as progress toward design complete, coverage and effectiveness indicators of design reviews, and so on. If one is conducting an assessment for PR2, then indicators pertaining to the status of coding activities, code integration into the system library, and builds and releases of drivers for testing will be pertinent. It is important to plan ahead which indicators, metrics, and information you intend to rely on for your assessment at the various checkpoints. If you have a metrics program in place and have been tracking the necessary data on an ongoing basis, conducting quality assessments will be much easier. However, if you are starting from scratch at this point, don't despair. There are always data, information, and observations that one can gather and analyze even when a metrics program is not in place. This is also a good time to start such a program and to demonstrate the value added by the tracking system and the metrics.
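The checkpoint-by-checkpoint planning described above can be captured in a simple lookup. The checkpoint names PR1 and PR2 follow the text; the indicator lists are illustrative examples drawn from it, not a prescribed set:

```python
# Hypothetical sketch: a plan mapping each project checkpoint to the
# indicators to gather for its quality assessment. Checkpoint names
# follow the text; indicator lists are illustrative only.
CHECKPOINT_INDICATORS = {
    "PR1": [
        "progress toward design complete",
        "design review coverage",
        "design review effectiveness",
    ],
    "PR2": [
        "status of coding activities",
        "code integration into the system library",
        "builds and releases of drivers for testing",
    ],
}

def indicators_for(checkpoint):
    """Return the planned indicators for a checkpoint (empty if unplanned)."""
    return CHECKPOINT_INDICATORS.get(checkpoint, [])

for name in indicators_for("PR1"):
    print(name)
```

Writing the plan down before the checkpoint arrives makes it clear which data must already be under collection, which is the point of having a metrics program in place.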

15.1.2 Don't Overlook Qualitative Data

The preceding discussion focuses on quantitative data. Qualitative data is equally important, and at times even more so. We gather much of our qualitative data through one-on-one interviews or small group discussions. Information gathered via formal meetings, such as presentations from functional development teams, is useful but usually needs in-depth follow-up. We first determine who we want to talk to, then we prepare a list of the questions we want to ask. To determine the "who," think about the following:

  • Whose input is key at this stage?
  • Which people are the most knowledgeable about what's happening at this stage?
  • Am I including people from a variety of areas (developers, testers, support groups) to give me a balanced view?

To develop the list of questions, use both specific and open-ended questions. Open-ended questions are often the most useful. Here are some examples:

  • Where are we?
  • What's the outlook?
  • Where are the weak areas?
  • What are the risks?
  • Are there any mitigation plans? What are they?
  • How does this project compare to past projects in your assessment?

This last question helps to put the person's comments into perspective. Asking people to compare the current release to a specific past release puts all qualitative data into a similar frame of reference. During the preparation phase, we determine which past release or releases would be best for such comparison. For organizations without historical data for comparison or analysis of metric levels and trends, quality assessment planning may not be easy. For quality indicators that are well practiced in the industry (e.g., defect removal efficiency), targets can be based on industry benchmarks and best practices (Jones, 2000).
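Defect removal efficiency, mentioned above as an indicator with published industry benchmarks (Jones, 2000), is commonly computed as the share of total defects removed before the product reaches the field. A minimal sketch, with invented numbers:

```python
# Sketch of the common defect removal efficiency (DRE) calculation:
# defects removed during development divided by total defects
# (development plus field), expressed as a percentage.

def defect_removal_efficiency(removed_in_development, found_in_field):
    """DRE as a percentage; 0.0 if no defects are recorded at all."""
    total = removed_in_development + found_in_field
    return 100.0 * removed_in_development / total if total else 0.0

# Invented numbers: 950 defects removed before release, 50 found in the field.
print(defect_removal_efficiency(950, 50))  # -> 95.0
```

An organization without its own historical baseline can compare a value like this against industry benchmarks to set checkpoint targets.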

Figure 15.2 shows a list of quality indicators for quality assessment at various project checkpoints. The list includes both quantitative and qualitative indicators.

Figure 15.2. Quality Indicators by Checkpoint and Development Phase


Metrics and Models in Software Quality Engineering (2nd Edition). ISBN: 0201729156.