12.2 THE ACIC CLOSURE ANALYSIS REPORT

This section presents the closure analysis report of the ACIC project. First, the report gives some general information about the project. The performance summary that follows shows that the project had an effort overrun of about 19%, caused by two major change requests. It also gives the planned versus actual data for team size, start and end dates, quality, productivity, cost of quality, defect injection rate, and defect removal efficiency. On almost all of these parameters, the actual performance was very close to the estimates. The actual defect injection rate is about 26% lower than estimated, largely because of the defect prevention activities.
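The deviation figures in the summary and in the tables that follow are consistent with the simple convention (actual - estimated) / estimated, usually quoted as a magnitude. A minimal sketch (the convention is inferred from the report's numbers rather than stated explicitly, and the helper name is ours):

```python
# Deviation convention apparently used throughout the report (inferred
# from the tables, not stated explicitly in the source).
def deviation_pct(actual: float, estimated: float) -> float:
    return (actual - estimated) / estimated * 100

print(round(deviation_pct(597, 501)))     # total effort: 19 (% overrun)
print(round(deviation_pct(0.022, 0.03)))  # defect injection rate: -27 (~26% lower)
```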

The report gives an overview of the process tailoring done in the project and specifies the internal and external tools that were used. For risk management, the report discusses the risks that were originally identified as well as the real risks that the project leader and the SEPG feel existed for the project. As you can see, these are not the same; a new risk, conversion to VAJ 3.0, arose during the project. The notes on risk mitigation state that this risk was effectively managed by showing the customer the impact of the change and then agreeing to postpone the conversion to a future version. For the other risks, the notes assess the effectiveness of the risk mitigation strategies.

The report records the estimated and actual size in terms of the number of use cases of different complexity. The size of the final system is also given in LOC, along with the language. For this project, the size was about 33 KLOC of Java code, which translates to about 1612 FP, and about 1 KLOC of COBOL code, which translates to about 12 FP. This size figure was used to compute the project's productivity and quality.
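As a quick check of these figures, the arithmetic below backsolves the FP sizes from the LOC counts and the conversion factors quoted in section 6 of the report (21 LOC per FP for Java, 107 LOC per FP for COBOL), and shows that the productivity figure of 58 in the performance summary is consistent with FP per person-month; reading the productivity unit as FP/person-month is our inference, not stated in the report:

```python
# Size conversion using the factors the report cites from published tables.
java_fp = 33_865 / 21    # ~1612 FP
cobol_fp = 1_241 / 107   # ~12 FP
total_fp = java_fp + cobol_fp          # ~1624 FP

# Productivity: total FP over total actual effort (27.97 person-months
# from the effort table) reproduces the 58 in the performance summary.
print(round(total_fp / 27.97))         # -> 58 FP per person-month
```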

Next, the estimated and actual schedules for various phases are given. Where the deviation is large (for example, in acceptance testing), the report gives a reason for the slippage.

Next, the data on effort are shown. First, the report gives the distribution of actual effort over the various project phases, along with the task, review, and rework effort for each phase. Using this breakdown, the cost of quality for this project has been computed at 31.4%. Then the estimated and actual effort for the various stages of the project are given, along with the reason for deviation where the deviation is large. As you can see, the overall deviation in this project is not very large, although for a few phases the deviation is substantial and is sometimes negative.
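The 31.4% figure is consistent with counting review, rework, testing, and training effort as the cost of quality. The sketch below reproduces it from the effort-distribution table in section 8, though this exact grouping is our reconstruction rather than a formula spelled out in the report:

```python
# Effort figures (person-hours) from the distribution table in section 8.
review = 69.5                            # review effort across all stages
rework = 343.5                           # rework effort across all stages
testing = 129.5 + 567.5 + 90.0 + 336.5   # task effort of the four test stages
training = 104.5
total = 5229.5

cost_of_quality = (review + rework + testing + training) / total
print(f"{cost_of_quality:.1%}")          # -> 31.4%
```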

Next, the report contains an analysis of the defects. The distribution of defects among the various defect detection stages is given, along with the estimates for those stages. As with other parameters, the percent deviation is also shown. Here, too, the overall deviation is 20%, although there is significant deviation for some stages. The report states the reason for the overall reduction and for the significant deviation in the distribution of actual defects. Then the defect data are given by stage detected and stage injected. Using these data, defect removal efficiencies are computed for each defect removal stage. In this project, the removal efficiency was 100% for requirements and design reviews, only 55% for code review, only 32% for unit testing, 91% for system testing, and 100% for acceptance testing (as only those defects removed before the end of acceptance testing are known). The overall defect removal efficiency is 97.4%, which is satisfactory; the goal was 97%. The distribution of defects with respect to severity and defect type has also been computed.
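These per-stage efficiencies can be reproduced from the stage-injected/stage-detected counts in section 9: each stage's efficiency is the fraction it removes of the defects present when the stage starts. A minimal sketch (the injected-before counts are read off the report's tables):

```python
# Defect removal efficiency (DRE) per stage: defects removed in a stage
# divided by defects present at its start (injected so far, minus those
# already removed). Counts come from the report's defect tables.
stages = [  # (stage, newly injected before this stage, removed in it)
    ("req. review",                5,  5),   # requirements defects
    ("design review",              6,  6),   # design defects
    ("code review",              105, 58),   # build defects: 58+15+29+3
    ("unit testing",               0, 15),
    ("integration/system testing", 0, 29),
    ("acceptance testing",         0,  3),
]

present = 0
for stage, injected, removed in stages:
    present += injected
    print(f"{stage}: {removed / present:.0%}")   # 100, 100, 55, 32, 91, 100
    present -= removed

print(f"overall: {113 / 116:.1%}")               # -> 97.4%
```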

Finally, a causal analysis provides the possible process reasons for the situations in which the planned goals were not met. In this project, the reasons for performance deviation are discussed, along with the performance data. The lessons learned from this project are summarized here. The process assets that have been submitted are also recorded.

Closure Report of the ACIC Project

1. GENERAL INFORMATION

Project Code: Xxxxx

Life Cycle: Development, full life cycle

Business Domain: Finance. Web-based application for managing accounts.

Project Leader/Module Leader: Xxxxxx

Business Manager:

Software Quality Adviser: Xxxxx

2. PERFORMANCE SUMMARY

| Performance Parameter | Actual | Estimated | Deviation | Reasons for Deviation (If Large) |
|---|---|---|---|---|
| Total effort (person-days) | 597 | 501 | 19% | Two major change requests. |
| Peak team size | 9 | 9 | 0 | N/A |
| Start date | 03 Apr 2000 | 03 Apr 2000 | 0 | N/A |
| End date | 30 Nov 2000 | 03 Nov 2000 | 27 days | Two major change requests consumed more than 5% of the effort. |
| Quality (defects delivered per FP) | 0.002 | 0.0125 | 84% | Quality improved because of defect prevention and the use of an incremental process. |
| Productivity (FP per person-month) | 58 | 57 | 2% | N/A |
| Cost of quality | 31.4% | 33% | 5% | N/A |
| Defect injection rate (defects per person-hour) | 0.022 | 0.03 | 26% | Improved because of defect prevention. |
| Defect removal efficiency | 97.4% | 97% | Small | N/A |

3. PROCESS DETAILS

Process Tailoring

- The Rational Unified Process (RUP) was employed.
- Development and analysis were done iteratively: 3 iterations for development and 2 for design and analysis.
- Requirements traceability was maintained with the Requisite Pro tool.

4. TOOLS USED

Notes on Tools Used

- External tools: VSS, VAJ, Requisite Pro, MSP
- Internal tools: BugsBunny, WAR

5. RISK MANAGEMENT

Risks identified at the start of the project

Risk 1: Lack of support from the database architect and database administrator of the customer
Risk 2: Improper use of RUP, as it is being used for the first time
Risk 3: Personnel attrition
Risk 4: Problems with working on the customer's database over the link

Risks encountered during the project

Risk 1: Impact of conversion to VAJ 3.0
Risk 2: Lack of support from the database architect and database administrator of the customer
Risk 3: Improper use of RUP, as it is being used for the first time
Risk 4: Personnel attrition

Notes on Risk Mitigation

Risk 1: Clearly articulating the risk helped the customer agree to postpone the conversion, with its impact properly budgeted.

Risk 2: The mitigation strategies of careful advance planning and employing the on-site coordinator were effective.

Risk 3: Training the team in RUP was effective, as was keeping the customer informed.

Risk 4: Remained a risk, although it did not materialize. The impact would have been minimal because multiple people were kept informed of each critical activity.

6. SIZE

 

| | Estimated | Actual |
|---|---|---|
| Number of simple use cases | 5 | 5 |
| Number of medium use cases | 9 | 9 |
| Number of complex use cases | 12 | 12 |

Notes on Estimation

Classification Criteria. The standard definition of simple, medium, and complex was used for classifying the use cases. This worked fine.

Final System Size in FP

The size of the final source is measured in LOC and normalized to FP by using published conversion tables. For Java, the tables suggest that 21 LOC equal 1 FP; for COBOL, 107 LOC equal 1 FP.

| Output Language | Size in LOC | Size in FP |
|---|---|---|
| Java | 33,865 | 1612 |
| COBOL | 1241 | 12 |

7. SCHEDULE

| Phase | Actual Elapsed Time (days) | Estimated Time (days) | % Slippage | Reasons for Slippage |
|---|---|---|---|---|
| Requirements | 28.67 | 31 | 6.5 | |
| High-level design | 0 | 0 | 0.0 | |
| Detailed design | 38.8 | 42 | 6.7 | |
| Coding | 132 | 135 | 1.6 | |
| Unit testing | 9 | 10 | 9.3 | |
| Total - Build | 141 | 144 | 2.1 | |
| Integration testing | 40 | 40 | 0 | |
| System testing | 15 | 0 | 0.0 | |
| Acceptance testing | 30 | 10 | 200.0 | AT completion was extended at the customer's request. |

8. EFFORT

Distribution over Life-Cycle Stages (effort in person-hours)

| Stage | Task | Review | Rework | Total |
|---|---|---|---|---|
| Requirements | 210.0 | 10.0 | 60.0 | 280.0 |
| High-level design | 0.0 | 0.0 | 0.0 | 0.0 |
| Detailed design | 652.0 | 14.0 | 29.5 | 695.5 |
| Coding | 1188.0 | 39.5 | 76.5 | 1304.0 |
| Unit testing | 129.5 | 0.0 | 17.0 | 146.5 |
| Integration testing | 567.5 | 6.0 | 160.5 | 734.0 |
| System testing | 90.0 | 0.0 | 0.0 | 90.0 |
| Acceptance testing | 336.5 | 0.0 | 0.0 | 336.5 |
| Total - LC stages | 3173.5 | 69.5 | 343.5 | 3586.5 |
| Project management | 733.1 | 0.0 | 0.0 | 733.1 |
| Training | 104.5 | 0.0 | 0.0 | 104.5 |
| CM | 317.0 | 0.0 | 0.0 | 317.0 |
| Misc. | 488.5 | 0.0 | 0.0 | 488.5 |
| Total - mgmt, training, and misc. | 1643.0 | 0.0 | 0.0 | 1643.0 |
| Total effort (person-hours) | 4816.50 | 69.50 | 343.50 | 5229.50 |
| Total effort (person-months) | 25.76 | 0.37 | 1.84 | 27.97 |

Cost of Quality

Cost of quality = (review effort + rework effort + testing effort + training effort) / total effort
= (69.5 + 343.5 + 1123.5 + 104.5) / 5229.5
= 31.4%

Effort Distribution and Actual Versus Estimated

 

| Stage | Actual Effort (person-hours) | Actual % | Estimated Effort (person-hours) | Estimated % | % Deviation | Reasons for Deviation |
|---|---|---|---|---|---|---|
| Requirements | 280.0 | 5.35 | 475.0 | 10 | 41 | Effort was overestimated (data from an earlier project did not help because that project did not have this phase). |
| Design (HLD and detailed) | 695.5 | 13.30 | 569.0 | 12 | 22 | Design took more time because the team was inexperienced with Rational Rose and OOAD. |
| Coding | 1304.0 | 24.94 | 1235.3 | 26 | 6 | |
| Unit testing | 146.5 | 2.80 | 142.5 | 3 | 3 | |
| Integration testing | 734.0 | 14.04 | 331.0 | 7 | 120 | Much effort was spent fixing bugs introduced during reconciliation with the Synergy and Window Resized code. |
| System testing | 90.0 | 1.72 | 95.0 | 2 | 5 | |
| Acceptance testing | 336.5 | 6.43 | 285.0 | 6 | 18 | Acceptance testing was not completed on Nov 3 and was extended until Nov 23 because of delays from the customer. |
| Total - LC stages | 3586.5 | 68.58 | 3132.8 | 66 | 14.5 | |
| Project management | 733.1 | 14.02 | 713.0 | 15 | 3 | |
| Training | 104.5 | 2.00 | 455.0 | 10 | 77 | |
| CM | 317.0 | 6.06 | 142.0 | 3 | 123 | Deviation due to reconciliation issues. |
| Misc. | 488.5 | 9.34 | 285.0 | 6 | 71 | Higher because training effort was booked under Misc. |
| Total - mgmt, training, and misc. | 1643.0 | 31.42 | 1595.0 | 34 | 3.01 | |
| Total | 5229.5 | 100 | 4727.8 | 100 | 10.6 | |

9. DEFECTS

Defect Distribution

| Stage Detected | Actual Number of Defects | % of Total Defects Found | Estimated Number of Defects | % of Total Estimated Defects | % Deviation |
|---|---|---|---|---|---|
| Req. and design review | 11 | 10 | 29 | 20 | 62 |
| Code review | 58 | 50 | 29 | 20 | 100 |
| Unit testing | 15 | 13 | 57 | 40 | 73 |
| Integration and system testing | 29 | 25 | 25 | 17 | 16 |
| Acceptance testing | 3 | 2 | 5 | 3 | 40 |
| Total | 116 | 100 | 145 | 100 | 20 |

Reasons for Deviation

1. Defect prevention reduced the defect injection rate in later stages, resulting in an overall reduction in the defect injection rate.

2. In the earlier project from which the estimates were derived, fewer code reviews were done and there was heavier reliance on unit testing. In this project, because code reviews were done more rigorously and widely, more defects were found in reviews, leading to a substantial decrease in the defects found in unit testing.

Defect Removal Efficiencies

| Defect Detection Stage | Injected in Req. | Injected in Design | Injected in Build | Defect Removal Efficiency |
|---|---|---|---|---|
| Req. review | 5 | | | 100% |
| Design review | 0 | 6 | | 100% |
| Code review | 0 | 0 | 58 | 55% (58 / (58 + 15 + 29 + 3)) |
| Unit testing | 0 | 0 | 15 | 32% (15 / (15 + 29 + 3)) |
| Integration/system testing | 0 | 0 | 29 | 91% (29 / (29 + 3)) |
| Acceptance testing | 0 | 0 | 3 | 100% |

Overall Defect Removal Efficiency = 113 / 116 = 97.4 %

Distribution by Severity

| Sequence Number | Severity | Number of Defects | % of Total Defects |
|---|---|---|---|
| 1 | Cosmetic | 26 | 22.4 |
| 2 | Minor | 51 | 44 |
| 3 | Major | 36 | 31 |
| 4 | Critical | 3 | 2.6 |
| 5 | Others | | |
| | Total | 116 | 100 |

Distribution by Defect Type

| Sequence Number | Defect Type | Number of Defects | % of Total Defects |
|---|---|---|---|
| 1 | Logic | 33 | 28.4 |
| 2 | Standards | 29 | 25 |
| 3 | Performance | 24 | 20.7 |
| 4 | Redundant code | 14 | 12 |
| 5 | User interface | 9 | 7.7 |
| 6 | Architecture | 4 | 3.5 |
| 7 | Consistency | 2 | 1.7 |
| 8 | Reusability | 1 | 0.9 |
| | Total | 116 | 100 |

10. CAUSAL ANALYSIS AND LESSONS LEARNED

There were very few large deviations in process performance; the actual performance was close to what was expected. Where the deviations are large, the reasons are given along with the deviation data. Some key lessons learned are:

1. Incremental or phased development is extremely helpful in achieving higher quality and productivity because data from the first phase can be used to improve the remaining phases through defect prevention.

2. Defect prevention can substantially reduce the defect injection rate. It also pays off handsomely in effort: a few hours invested in defect prevention can yield 5 to 10 times that effort in savings through reduced rework.

3. If a change request has a major impact, discussing it with the customer using a detailed impact analysis can be very helpful in setting the right expectations and doing a proper cost-benefit analysis (which may result in postponement of the change, as happened in this project).

4. The defect removal efficiencies of code reviews and unit testing are very low. The processes for both, and the implementation of these processes, need to be reviewed to improve these numbers. In this project, system and integration testing compensated for the poor performance of reviews and unit testing. For larger projects, however, this may not be possible, and poor performance in reviews and unit testing can have adverse effects on quality.

11. PROCESS ASSETS SUBMITTED

Project management plan, project schedule, configuration management plan, Java coding standards, code review checklist, integration plan review checklist, impact analysis checklist, causal analysis reports for defect prevention.

12. REFERENCES

Omitted.

 


