
12.1 PROJECT CLOSURE ANALYSIS

Project closure analysis is the key to learning from the past and turning that learning into future improvement. To achieve this goal, it must be done carefully, in an atmosphere of safety, so that lessons can be captured and used to improve the process and future projects. Before we describe the details of the closure analysis report, we briefly discuss the role of closure analysis and its implementation.

12.1.1 The Role of Closure Analysis

The objective of a postmortem or closure analysis is "to determine what went right, what went wrong, what worked, what did not, and how it could be made better the next time."2 Relevant information must be collected from the project, primarily for use by future projects. That is, the purpose of having an identified completion analysis activity, rather than simply saying, "The project is done," is not to help this project but rather to improve the organization by leveraging the lessons learned. This type of learning can be supported effectively by analysis of data from completed projects. This analysis is also needed to understand the performance of the process on this project, which in turn is needed to determine the process capability.

As noted earlier, the data obtained during the closure analysis are used to populate the process database (PDB). The data from the PDB can be used directly by subsequent projects for planning purposes. This information is also used in computing the process capability, which is used by projects in planning and for analyzing trends. Figure 12.1 illustrates the role of closure analysis.

Figure 12.1. The role of closure analysis


Earlier chapters discuss the types of data generally collected in a project and describe the collection methods. The amount of raw data collected in a project can be quite large. For example, a project involving five people and lasting for 25 weeks will have 125 entries for weekly effort, data for about 250 defects (assuming about 0.05 defects injected per person-hour), data on many change requests, various outputs, and so on. Clearly, these data will be of limited use unless they are analyzed and presented within a proper framework and at a suitable level of abstraction. Closure analysis aims to accomplish this goal.
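
As a rough illustration of the arithmetic behind these numbers, the following sketch assumes 40 working hours per person-week; the team size, duration, and injection rate are the ones used in the example above.

    # Back-of-the-envelope data volume for a 5-person, 25-week project.
    # Assumes 40 working hours per person-week and roughly 0.05 defects
    # injected per person-hour, as in the example in the text.
    people = 5
    weeks = 25
    hours_per_week = 40
    defect_injection_rate = 0.05  # defects per person-hour

    weekly_effort_entries = people * weeks                  # 125 WAR entries
    total_person_hours = people * weeks * hours_per_week    # 5,000 person-hours
    approx_defects = total_person_hours * defect_injection_rate  # ~250 defects

    print(weekly_effort_entries, total_person_hours, approx_defects)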

After data analysis and extraction of all lessons learned from the analyses, the results should be packaged so that they can be used by others (packaging is the last step in the quality improvement paradigm6). Furthermore, to leverage this information, project processes must be constructed so that their execution requires the effective use of data. It can be argued, however, that even if others do not learn from the packaged information, the project personnel will have consolidated their experience and will carry the lessons learned from the analysis into future projects.2 In other words, a closure analysis is useful even if others do not directly gain from it.

12.1.2 Performing Closure Analysis

At Infosys, the project manager carries out the closure analysis with help from the SEPG quality adviser associated with the project. A template for the analysis report has been defined. The person carrying out the closure analysis must fill out this template properly, using mostly the metrics data, thereby keeping the focus on objective information.

As discussed earlier, the effort data are available from the weekly activity report database. The defect data can be gathered from the defect control system. Size data are obtained from the project. Planning data appear in the project management plan. These data constitute the main information needed for metrics analysis.

The data are first analyzed by the quality adviser, who develops an initial interpretation of the results. A meeting is then held among the quality adviser, the project leader, and other project members. The initial report serves as the basis of discussion, and further points and observations from the meeting are also noted. This meeting yields the basis of the final closure analysis report.

The final report is submitted to the SEPG and the business manager of the project and is shared among the project team members. The report is also entered in the PDB, making it available for future projects and analyses.

12.1.3 Closure Analysis Report

This section briefly discusses the major elements in an Infosys project closure analysis report; later, we present the closure report of the ACIC project. The contents of this analysis report form a superset of the data that are put in the PDB. The PDB contains only those metrics data that are needed often by projects and whose use is required by the current processes. The analysis report, however, may capture other data that might shed light on process performance or help to better explain the process.

General and Process-Related Information

The closure report first gives general information about the project: the overall productivity achieved and quality delivered, the process used and process deviations, the estimated and actual start and end dates, the tools used, and so on. This section might also include a brief description of the project's experience with tools (detailed "experience reports" are put into the BOK system). The information about tools can be used by other projects to decide whether use of a tool is warranted. It can also be examined to identify tools that offer clear benefits and to propagate their use throughout the rest of the organization.

Risk Management

The risk management section gives the risks initially anticipated for the project along with the risk mitigation steps planned. In addition, this section lists the top risks as viewed in the post-project analysis (they are the real risks for the project). This information can be used by later projects and can be used to update risk management guidelines. Notes may also be provided on the effectiveness of the mitigation steps employed.

Size

As discussed in Chapter 6, many projects use the bottom-up method for estimation. In this method, the size of the software is estimated in terms of the number of simple, medium, or complex modules. Hence, this size is captured along with the criteria used for classification (different projects may use different criteria). Data on both the estimated size and the actual size are included.
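
A record of this size data might look like the following sketch; the field names, classification criteria, and counts are illustrative only, not the actual closure template or PDB schema.

    # Illustrative record of bottom-up size data captured at closure.
    # The criteria and counts are hypothetical examples.
    size_record = {
        "classification_criteria": "modules classified by number of screens and tables touched",
        "estimated": {"simple": 12, "medium": 8, "complex": 3},
        "actual":    {"simple": 14, "medium": 7, "complex": 4},
    }

    def total_modules(counts):
        """Total number of program units across complexity categories."""
        return sum(counts.values())

    print(total_modules(size_record["estimated"]), total_modules(size_record["actual"]))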

For normalization purposes, the productivity of a project is measured in terms of function points (FP) per person-month. Although FP can be counted by studying the functionality of the system, at closure time it is computed from the measured size in lines of code (LOC). If multiple languages are used, we simply add the sizes (in FP) of the modules in different languages. Strictly speaking, function points (unlike lines of code) are not an additive measure. Because we are measuring only the size of the complete system in FP, however, this approach is equivalent to converting all LOC counts into an LOC count of some "universal" language and then converting that size into FP. Furthermore, because of the inherent limitations of software metrics and their use, some inaccuracies are acceptable, provided that the methods are used consistently. The size in FP is also captured in the closure analysis report.
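
A minimal sketch of this computation follows; the LOC-per-FP conversion factors, the measured sizes, and the effort figure are illustrative assumptions, not organizational standards.

    # Convert measured LOC per language to a single FP count by summing the
    # FP of the modules in each language. The LOC-per-FP factors below are
    # illustrative assumptions (published backfiring tables or the
    # organization's own data would be used in practice).
    LOC_PER_FP = {
        "cobol": 107,
        "java": 53,
        "sql": 13,
    }

    def size_in_fp(loc_by_language):
        """Total size in function points, summed across languages."""
        return sum(loc / LOC_PER_FP[lang] for lang, loc in loc_by_language.items())

    measured_loc = {"java": 21_000, "sql": 2_600}   # hypothetical measured sizes
    total_fp = size_in_fp(measured_loc)
    productivity = total_fp / 30.5                  # FP per person-month, assuming 30.5 person-months of effort
    print(round(total_fp), round(productivity, 1))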

Effort

The closure analysis report also contains the total estimated effort and the actual effort in person-hours. The total estimated effort is obtained from the project management plan. The total actual effort is the sum of the total effort reported in all WARs submitted by the project members, including the project leader. If the deviation between the actual and the estimated values is large, reasons for this variation are recorded.
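
A small sketch of this check; the 20% threshold and the effort figures are illustrative assumptions, not Infosys norms.

    # Flag a large effort deviation so that reasons can be recorded.
    # The 20% threshold and the figures are hypothetical.
    def effort_deviation(estimated_hours, actual_hours):
        """Deviation of actual effort from the estimate, as a fraction."""
        return (actual_hours - estimated_hours) / estimated_hours

    estimated, actual = 5000, 6200          # hypothetical person-hours
    deviation = effort_deviation(estimated, actual)
    if abs(deviation) > 0.20:
        print(f"Deviation {deviation:.0%}: record reasons in the closure report")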

For each of the major steps in the process, the total actual effort and the estimated effort for the stage are also captured. This information can be useful in planning, and it is a key input in forming the process capability baseline (PCB). For each stage, where possible, the effort is separated into the effort for the task, for the review, and for the rework. The WAR codes described earlier in the book permit this separation. The distribution of effort across the various phases can then be computed and recorded. The separation of effort among task, review, and rework aids in identifying the scope for productivity improvement.
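
The following sketch shows one way to compute such a distribution; the stage names and effort figures are hypothetical.

    # Summarize effort by stage, separated into task, review, and rework,
    # and compute each stage's share of total effort. Values are hypothetical.
    effort_by_stage = {
        # stage: (task, review, rework) effort in person-hours
        "design":  (400, 60, 40),
        "build":   (1500, 200, 150),
        "testing": (600, 0, 180),
    }

    total = sum(sum(triple) for triple in effort_by_stage.values())
    for stage, (task, review, rework) in effort_by_stage.items():
        stage_total = task + review + rework
        print(f"{stage:8s} {stage_total:5d} h  "
              f"({stage_total / total:.0%} of total), rework {rework / stage_total:.0%}")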

The cost of quality for the project is also computed. It measures the cost of all activities that directly contributed to achieving quality. The cost of quality can be defined in many ways; here it is defined as the percentage of the total effort spent in review, testing, rework to remove defects, and project-specific training.
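
A minimal sketch of this computation, using the definition above and hypothetical effort figures:

    # Cost of quality as defined above: the percentage of total effort spent
    # on reviews, testing, defect rework, and project-specific training.
    def cost_of_quality(review, testing, rework, training, total_effort):
        return 100.0 * (review + testing + rework + training) / total_effort

    coq = cost_of_quality(review=260, testing=780, rework=370, training=150,
                          total_effort=5800)   # hypothetical person-hours
    print(f"Cost of quality: {coq:.1f}% of total effort")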

Defects

The defects section of the closure analysis report contains a summary of the defects found during the project. The defects can be analyzed with respect to severity (the percentage of defects that were major, minor, or cosmetic), stage detected (the percentage of total defects detected by each activity), stage injected (the percentage of total defects introduced by each activity), and so on. The defect injection rate and the defect distribution are also determined.
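
One way to compute such distributions from raw defect records is sketched below; the record fields and values are hypothetical, not the actual defect control system schema.

    # Percentage distribution of defects over an attribute such as severity,
    # stage detected, or stage injected. The records are hypothetical.
    from collections import Counter

    defects = [
        {"severity": "major", "detected_in": "unit testing",   "injected_in": "build"},
        {"severity": "minor", "detected_in": "code review",    "injected_in": "build"},
        {"severity": "minor", "detected_in": "system testing", "injected_in": "design"},
        # ... one record per defect from the defect control system
    ]

    def distribution(records, attribute):
        """Percentage distribution of defects over one attribute."""
        counts = Counter(r[attribute] for r in records)
        return {key: 100.0 * n / len(records) for key, n in counts.items()}

    print(distribution(defects, "severity"))
    print(distribution(defects, "detected_in"))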

The defect removal efficiency of a defect removal task is defined as the percentage of total defects that existed at the time of execution of the task that are detected by the execution of the task. This metric is useful for determining which quality activities need improvement. The closure report gives the defect removal efficiency of the major quality control tasks, as well as the overall defect removal efficiency of the process. Other analyses of defect data may also be included. Sometimes, a separate analysis of the review data may be performed. The estimated versus actual defect levels are also analyzed.
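
The sketch below follows this definition, assuming the removal tasks are listed in execution order; the defect counts are hypothetical, and defects injected after the last task are ignored for simplicity.

    # Defect removal efficiency (DRE) of each quality task: defects detected
    # by a task divided by defects that existed when the task was executed.
    # Counts are hypothetical; tasks are assumed to be in execution order.
    removal_tasks = [
        # (task, defects injected before/during its span, defects it detected)
        ("requirements review", 20, 12),
        ("design review",       45, 25),
        ("code review",        160, 70),
        ("unit testing",        40, 60),
        ("system testing",      10, 95),
    ]

    escaped = 0  # defects that slipped past all earlier tasks
    for task, injected, detected in removal_tasks:
        existing = escaped + injected
        dre = 100.0 * detected / existing
        escaped = existing - detected
        print(f"{task:20s} DRE = {dre:.0f}%")

    # Overall DRE of the process: fraction of all injected defects removed.
    total_injected = sum(inj for _, inj, _ in removal_tasks)
    total_detected = sum(det for _, _, det in removal_tasks)
    print(f"Overall DRE = {100.0 * total_detected / total_injected:.0f}%")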

Causal Analysis

When the project is finished, the performance of the overall process on this project is known. If the performance is outside the range given in the capability baseline, there is a good chance that the variability has an assignable cause. Causal analysis involves looking at large variations and then identifying their causes, generally through discussion and brainstorming.
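
As a sketch of the comparison that triggers causal analysis, assuming hypothetical baseline ranges and project values:

    # Flag metrics that fall outside the range given in the capability
    # baseline, as candidates for causal analysis. All values are hypothetical.
    capability_baseline = {
        # metric: (low, high)
        "productivity_fp_per_pm": (10.0, 16.0),
        "defect_injection_rate":  (0.02, 0.06),   # defects per person-hour
        "cost_of_quality_pct":    (25.0, 35.0),
    }

    project_performance = {
        "productivity_fp_per_pm": 8.5,
        "defect_injection_rate":  0.05,
        "cost_of_quality_pct":    41.0,
    }

    for metric, value in project_performance.items():
        low, high = capability_baseline[metric]
        if not (low <= value <= high):
            print(f"{metric}: {value} outside [{low}, {high}]; look for assignable causes")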

Process Assets

In addition to the metrics data, other project artifacts are potentially useful for future projects. Chapter 2 discusses the use of process assets. These process assets are collected at project closure. The potential entries to the BOK are also identified during closure, although they are submitted later.

 


