Orthogonal Defect Classification

Orthogonal defect classification (ODC) is a method for in-process quality management based on defect cause analysis (Chillarege et al., 1992). Defect cause or defect type analysis by phase of development is not new; in many development organizations, metrics associated with defect cause are part of the in-process measurement system. The ODC method asserts that a set of mutually independent (orthogonal) cause categories can be developed, that these categories can be used across phases of development and across products, and that the distribution of these defect types is associated with process phases. The authors contend that a more or less stable "signature profile" of defect type distribution can be established for each phase of the development process. By examining the distribution of defect types, therefore, one can logically infer which development phase the project is in. The authors propose eight defect types:

  • Function
  • Interface
  • Checking
  • Assignment
  • Timing/serialization
  • Build/package/merge
  • Documentation
  • Algorithm

The authors contend that function defects (missing or incorrect functions) are associated with the design phase; interface defects with low-level design; checking defects with low-level design or code implementation; assignment defects with code; timing/serialization defects with low-level design; build/package/merge defects with library tools; documentation defects with publications; and algorithm defects with low-level design.
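To summarize these associations, here is a minimal lookup-table sketch in Python; the dictionary name and string labels are our illustration, not part of the ODC specification:

```python
# Phase associations for the eight ODC defect types, per the paragraph
# above. Illustrative representation only; not part of ODC itself.
EXPECTED_PHASE = {
    "function": "design",
    "interface": "low-level design",
    "checking": "low-level design or code",
    "assignment": "code",
    "timing/serialization": "low-level design",
    "build/package/merge": "library tools",
    "documentation": "publications",
    "algorithm": "low-level design",
}

# A "function" defect surfacing during system test points back to design:
print(EXPECTED_PHASE["function"])  # -> design
```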

The authors offer several examples of ODC. One example illustrates a high percentage of the defect type "function" found at a late stage in the development cycle. Specifically, the defect discovery time was classified into four periods; the last period corresponded approximately to the system test phase. In the last period the number of defects found almost doubled, and the percentage of defect type "function" increased to almost 50%. Because function defects are supposed to be found earlier (during the design phase), the observed distribution indicated a clear departure from the expected process behavior. Given that function defects were the cause of the departure, the analysis suggested that design reinspection, rather than more intensive testing, was the appropriate response.
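A minimal sketch of this kind of distribution analysis, assuming defects are logged as (period, defect_type) pairs; the function name, sample data, and 40% threshold are illustrative assumptions, not values from the original study:

```python
from collections import Counter, defaultdict

def type_distribution_by_period(defects):
    """Return {period: {defect_type: percent}} from (period, type) pairs."""
    counts = defaultdict(Counter)
    for period, dtype in defects:
        counts[period][dtype] += 1
    return {
        period: {t: 100.0 * n / sum(c.values()) for t, n in c.items()}
        for period, c in counts.items()
    }

# Illustrative data only; period 4 corresponds roughly to system test.
defect_log = [
    (1, "assignment"), (1, "function"), (2, "checking"),
    (4, "function"), (4, "function"), (4, "interface"),
    (4, "function"), (4, "algorithm"),
]

dist = type_distribution_by_period(defect_log)
# Function defects should surface during design, so a large share of
# them late in the cycle signals a departure from expected behavior.
if dist[4].get("function", 0.0) >= 40.0:
    print("High share of 'function' defects late in the cycle; "
          "consider design reinspection rather than more testing.")
```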

In addition to defect type analysis, the ODC method includes defect triggers to improve testing effectiveness. A defect trigger is a condition that allows a defect to surface. By capturing information on defect triggers during testing and for field defects reported by customers, the test team can improve its test planning and test cases to maximize defect discovery.
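As a sketch of how trigger data might feed back into test planning, assuming each defect record carries where it was found and its trigger (the record fields and trigger labels are illustrative, not the full ODC trigger taxonomy):

```python
from collections import Counter

# Illustrative records; "trigger" is the condition that allowed the
# defect to surface. Labels are examples, not the full ODC taxonomy.
defects = [
    {"found_in": "system test", "trigger": "workload/stress"},
    {"found_in": "field",       "trigger": "recovery"},
    {"found_in": "field",       "trigger": "recovery"},
    {"found_in": "field",       "trigger": "workload/stress"},
]

field_triggers = Counter(d["trigger"] for d in defects
                         if d["found_in"] == "field")
test_triggers = Counter(d["trigger"] for d in defects
                        if d["found_in"] != "field")

# Triggers common in field defects but rare during testing point to
# conditions the test plan should exercise more heavily.
for trigger, n in field_triggers.most_common():
    if test_triggers[trigger] < n:
        print(f"Under-exercised in test: {trigger}")
```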

The trigger part of ODC and its application to testing appear to be more solid than the assertion regarding the "signature profiles" of defect type distributions. Whether the process associations with defect types can be applied uniformly across products or organizations is an open question. Even assuming similar development processes, differences in process details and focus areas may lead to differences in the distribution of defect types and defect causes. For instance, in the example shown in Figure 9.19, final resolution of interface issues is one of the exit criteria of high-level design inspection (I0). Therefore, higher percentages of interface defects are observed at I0 instead of at low-level design (I1). Another variable is the maturity level of the development process, especially in terms of the error injection rate. A defect type distribution for a development organization with an error injection rate of 60 defects per KLOC is likely to differ from that of an organization with an error injection rate of 20 defects per KLOC. Actions for reducing error injection or preventing defects are likely to have stronger effects on some defect causes than on others.

With regard to using a defect type distribution to assess the progress of a project, the ODC method seems too indirect. The quality management models and the many in-process metrics discussed in this book would be more effective for project and quality management. At the defect analysis level, a more direct approach is to analyze the phase in which each defect was found versus its phase of origin (or test origin); see the examples in Figures 6.4 and 9.19.

The ODC method has evolved over the years, and more defect attributes have been developed. The attributes classified by ODC when a defect is opened include the following:

  • Activity: The specific activity that exposed the defect. For example, a defect may occur during system test when one clicks a button to select a printer; the phase is system test, but the activity is function test because the defect surfaced while performing a function test-type activity.
  • Trigger: The environment or condition that had to exist for the defect to surface.
  • Impact: The effect the defect had on the customer if it escaped to the field, or the effect it would have had if not found during development.

The attributes classified by ODC when a defect fix is known include the following (a sketch combining both sets of attributes appears after this list):

  • Target: What is being fixed (design, code, documentation, and so forth)
  • Defect type: The nature of the correction made
  • Defect qualifier (applies to defect type): Captures whether the implementation was nonexistent, wrong, or irrelevant
  • Source: The origin of the design/code that had the defect
  • Age: The history of the design/code that had the defect
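A sketch of a defect record carrying these attributes; the field names follow the lists above, but the dataclass itself (and the sample values) are our illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ODCDefect:
    # Classified when the defect is opened.
    activity: str              # e.g., "function test", even in system test
    trigger: str               # condition that let the defect surface
    impact: str                # customer effect if the defect escaped
    # Classified when the fix is known.
    target: Optional[str] = None       # design, code, documentation, ...
    defect_type: Optional[str] = None  # nature of the correction made
    qualifier: Optional[str] = None    # nonexistent, wrong, or irrelevant
    source: Optional[str] = None       # origin of the design/code
    age: Optional[str] = None          # history of the design/code

# Opened during system test, classified further when the fix was known.
d = ODCDefect(activity="function test", trigger="coverage",
              impact="usability")
d.target, d.defect_type, d.qualifier = "code", "assignment", "wrong"
```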

The ODC defect analysis method has been applied to many projects, and successful results have been reported (Bassin et al., 2002; Butcher et al., 2002). The most significant contribution of ODC seems to be in providing data-based assessments that lead to improved test effectiveness.

Data and resources permitting, we recommend that in-depth defect cause and defect type analysis be done (whether or not it follows the ODC classifications) as an integrated part of the in-process metrics, in the context of quality management models.

Recommendations for Small Organizations

The Rayleigh model for quality management is a useful framework, and I recommend it to development organizations of all sizes. For organizations that don't have data and metrics tracking for all phases of development, simply focus on the strategies and actions along the two directions of improvement: (1) those that lead to early defect removal and (2) those that reduce error injection by the development team. For small organizations that plan to start a metrics practice, I recommend the following for data tracking and metrics:

  • For the front end of the development process, use the inspection scoring checklist (Table 9.6).
  • For the middle of the development process, use the code integration pattern metric and over time establish a heuristic model.
  • For the back end of the development process, use a testing-defect-related metric or model (Figures 4.2, 9.10, 9.11, 9.15, or 10.5).

Implementing these three metrics provides support for the Rayleigh model of quality management.

We have discussed the testing defect arrival metric and its variants, and recommended it, in several chapters. The only point to add is that for small projects the time unit for this metric doesn't have to be the week; it can be days or even hours of testing, scaled according to the duration of testing and the volume of defect arrivals.

As discussed in Section 4, the code integration pattern metric is a powerful project management and quality management tool. It can be implemented easily by small and large teams, with or without an established metrics tracking system.

The inspection scoring checklist is well suited to small teams. It is a simple, flexible tool that can be implemented with just paper (the form) and pencil by small organizations starting a metrics program that may not be ready to invest the resources to track and analyze the defects found by design reviews and code inspections. Even in organizations with established metrics practices, metrics tracking for requirements, design, and code is usually accorded less effort than tracking for testing and field quality. The inspection scoring checklist is a good tool for the team's self-improvement. At the same time, it provides two important pieces of information for project and quality management on the front end of the development process: the quality of the design or code, and the effectiveness of the design reviews or code inspections. The checklist can be used with any form of review or inspection, ranging from formal inspection meetings to informal buddy reviews.

Accumulated data gathered via the checklist can be used to establish baselines and to indicate the process capability of the organization. Data from the current project can be compared to the baseline and used as an early predictive indicator. For example, if the average scores of the designs and the effectiveness of the design reviews are substantially higher than the norm established by history, one can expect lower testing and field defect rates (because of better intrinsic design and/or code quality) even if test effectiveness remains unchanged.
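A minimal sketch of this baseline comparison, assuming the checklist yields numeric scores per review; all values and names below are illustrative assumptions, not data from the book:

```python
from statistics import mean

# Historical baselines established from accumulated checklist data;
# values are illustrative only.
BASELINE_DESIGN_SCORE = 6.0
BASELINE_REVIEW_EFFECTIVENESS = 5.5

# Scores from the current project's design reviews.
design_scores = [7.5, 8.0, 7.0]
review_effectiveness = [7.0, 6.5]

if (mean(design_scores) > BASELINE_DESIGN_SCORE
        and mean(review_effectiveness) > BASELINE_REVIEW_EFFECTIVENESS):
    # Better intrinsic design/code quality plus better reviews point to
    # lower testing and field defect rates, even if test effectiveness
    # is unchanged.
    print("Early indicator: expect lower testing and field defect rates.")
```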
