For measuring process improvements over time, gross measures of entire projects are not granular enough to be effective. We have learned that it is necessary to get down to the level of specific activities in order for process improvements to become visible and measurable.
Activity-based costs can highlight the activities that benefit from process improvements, as well as those where no tangible benefit appears. This kind of analysis is not possible using only project-level data. Software benchmarking companies have found function point metrics superior to the older LOC metrics for measuring activity-based costs and schedules, because function points can measure noncoding activities such as requirements, design, documentation, and management.
We use the function point metrics defined by IFPUG. The current standard for function points published by IFPUG is version 4.1, which is used here. For a general introduction to the topic of function point analysis, refer to Dreger (1989) or to Garmus and Herron (1995).
Note that while IFPUG function points are the most widely used metric in the United States and Western Europe, other forms of function point metrics are in use as well. In the United Kingdom, for example, a variant called Mark II function points is very common. For a discussion of Mark II function points, refer to Symons (1991).
Table 18.4 illustrates a hypothetical project of 1,000 function points (roughly 125,000 C source statements). The organization producing this application would be a typical civilian organization at CMM level 1. Level 1 organizations are not very sophisticated in their software development approaches; as a result, they often have missed schedules, cost overruns, and poor-quality products.
The most obvious characteristic of CMM level 1 organizations is that testing is both the most expensive and the most time-consuming activity. The root cause of this phenomenon is that CMM level 1 organizations usually have excessive defect levels and are deficient in two key quality factors: defect prevention and pretest reviews and inspections.
By contrast, Table 18.5 illustrates exactly the same size and kind of software project, but shows the results noted in somewhat more mature organizations at level 3 on the Software Engineering Institute capability maturity model scale. For the level 3 organization, testing has diminished significantly in both time and cost, because defect prevention methods have improved and pretest design reviews and code inspections have both been strengthened.
Table 18.4. Example of Activity-Based Cost Analysis for SEI CMM Level 1

Application class: Systems software
Programming language(s): C
Size in function points (FP) [a.]: 1,000
Size in lines of code (LOC) [b.]: 125,000
Work hours per month: 132
Average monthly salary (burdened) [c.]: $7,500

| Activity | Work Hours per FP | Staff [d.] | Effort (Person Months) [e.] | Schedule (Months) [f.] | Costs by Activity | Percent of Costs |
|---|---|---|---|---|---|---|
| Requirements | 1.20 | 2.00 | 9.09 | 4.55 | $68,182 | 4% |
| Design | 2.93 | 3.33 | 22.22 | 6.67 | 166,667 | 11 |
| Design reviews | 0.38 | 4.00 | 2.86 | 0.71 | 21,429 | 1 |
| Coding | 7.76 | 6.67 | 58.82 | 8.82 | 441,176 | 29 |
| Code inspections | 0.53 | 8.00 | 4.00 | 0.50 | 30,000 | 2 |
| Testing | 8.25 | 6.67 | 62.50 | 9.38 | 468,750 | 31 |
| Quality assurance | 1.32 | 1.00 | 10.00 | 10.00 | 75,000 | 5 |
| Documentation | 1.10 | 1.00 | 8.33 | 8.33 | 62,500 | 4 |
| Management | 3.57 | 1.00 | 27.03 | 27.03 | 202,703 | 13 |
| Totals | 27.04 | 6.33 | 204.85 | 32.35 | $1,536,406 | 100% |

FP per month: 4.88
LOC per month: 610
Cost per FP: $1,536.41
Cost per LOC: $12.29

[a.] FP = function points per IFPUG Counting Practices Manual 4.1.
[b.] LOC = lines of noncommentary code.
[c.] Burdened = basic compensation and overhead costs.
[d.] Staff = number of workers assigned to major activities. Decimal values indicate some part-time personnel.
[e.] Person month = 22 workdays of 8 hours each.
[f.] Calendar months.
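The derived columns of Table 18.4 follow mechanically from the hours-per-FP and staffing columns. Below is a minimal Python sketch of that arithmetic; the relationships (effort = hours per FP times size divided by monthly hours, schedule = effort divided by staff, cost = effort times burdened salary) are inferred from the table rather than stated in the text, and the results differ from the published figures only by small rounding errors, since the hours-per-FP values are themselves rounded.

```python
SIZE_FP = 1_000          # size in function points
HOURS_PER_MONTH = 132    # work hours per person-month
SALARY = 7_500           # burdened monthly salary, dollars

# (activity, work hours per FP, staff) taken from Table 18.4
activities = [
    ("Requirements",      1.20, 2.00),
    ("Design",            2.93, 3.33),
    ("Design reviews",    0.38, 4.00),
    ("Coding",            7.76, 6.67),
    ("Code inspections",  0.53, 8.00),
    ("Testing",           8.25, 6.67),
    ("Quality assurance", 1.32, 1.00),
    ("Documentation",     1.10, 1.00),
    ("Management",        3.57, 1.00),
]

total_effort = total_cost = 0.0
for name, hours_per_fp, staff in activities:
    effort = hours_per_fp * SIZE_FP / HOURS_PER_MONTH  # person-months
    schedule = effort / staff                          # calendar months
    cost = effort * SALARY                             # dollars
    total_effort += effort
    total_cost += cost
    print(f"{name:18s} {effort:7.2f} pm  {schedule:6.2f} mo  ${cost:,.0f}")

print(f"Totals: {total_effort:.2f} person-months, ${total_cost:,.0f}")
print(f"FP per month: {SIZE_FP / total_effort:.2f}")
print(f"Cost per FP:  ${total_cost / SIZE_FP:,.2f}")
```

The same sketch reproduces Table 18.5 when the level 3 hours-per-FP figures are substituted, which is what makes the two profiles directly comparable.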
As Table 18.6 shows, a side-by-side analysis of the costs indicates an overall reduction of about 20% between the level 1 and level 3 versions, but the activity costs illustrate that most of the savings occur during the testing phase.
Indeed, the costs of inspections are higher, rather than lower, for the CMM level 3 version. Also, some costs, such as those for user documentation, are the same in both scenarios. Activity-based cost analysis allows a detailed scrutiny of differences between processes, and opens the way to rather sophisticated cost and schedule models that cannot be built from coarser project-level data.
Table 18.5. Example of Activity-Based Cost Analysis for SEI CMM Level 3

Application class: Systems software
Programming language(s): C
Size in function points (FP) [a.]: 1,000
Size in lines of code (LOC) [b.]: 125,000
Work hours per month: 132
Average monthly salary (burdened) [c.]: $7,500

| Activity | Work Hours per FP | Staff [d.] | Effort (Person Months) [e.] | Schedule (Months) [f.] | Costs by Activity | Percent of Costs |
|---|---|---|---|---|---|---|
| Requirements | 1.06 | 2.00 | 8.00 | 4.00 | $60,000 | 5% |
| Design | 2.64 | 3.33 | 20.00 | 6.00 | 150,000 | 12 |
| Design reviews | 0.88 | 4.00 | 6.67 | 1.67 | 50,000 | 4 |
| Coding | 6.00 | 6.67 | 45.45 | 6.82 | 340,909 | 28 |
| Code inspections | 1.06 | 8.00 | 8.00 | 1.00 | 60,000 | 5 |
| Testing | 3.30 | 6.67 | 25.00 | 3.75 | 187,500 | 15 |
| Quality assurance | 2.20 | 1.00 | 16.67 | 16.67 | 125,000 | 10 |
| Documentation | 1.10 | 1.00 | 8.33 | 8.33 | 62,500 | 5 |
| Management | 3.30 | 1.00 | 25.00 | 25.00 | 187,500 | 15 |
| Totals | 21.53 | 6.33 | 163.12 | 25.76 | $1,223,409 | 100% |

FP per month: 6.13
LOC per month: 766
Cost per FP: $1,223.41
Cost per LOC: $9.79

[a.] FP = function points per IFPUG Counting Practices Manual 4.1.
[b.] LOC = lines of noncommentary code.
[c.] Burdened = basic compensation and overhead costs.
[d.] Staff = number of workers assigned to major activities. Decimal values indicate some part-time personnel.
[e.] Person month = 22 workdays of 8 hours each.
[f.] Calendar months.
If we turn now to quality, improving both defect prevention and defect removal in the level 3 example causes a very significant reduction in delivered defects. In turn, the reduced number of defects allows shorter and more cost-effective development cycles. When projects run late, problems often escape notice until testing begins, and once major defects are found during testing it is too late to bring the project back under control. The goal is to prevent defects or eliminate them before testing gets under way.
Table 18.6. Side-by-Side Comparison of Activity-Based Costs

Application class: Systems software
Programming language(s): C
Size in function points (FP) [a.]: 1,000
Size in lines of code (LOC) [b.]: 125,000
Work hours per month: 132
Average monthly salary (burdened) [c.]: $7,500

| Activity | SEI CMM Level 1 | SEI CMM Level 3 | Variance in Costs | Variance Percent |
|---|---|---|---|---|
| Requirements | $68,182 | $60,000 | -$8,182 | -12.00 |
| Design | 166,667 | 150,000 | -16,667 | -10.00 |
| Design reviews | 21,429 | 50,000 | 28,571 | 133.33 |
| Coding | 441,176 | 340,909 | -100,267 | -22.73 |
| Code inspections | 30,000 | 60,000 | 30,000 | 100.00 |
| Testing | 468,750 | 187,500 | -281,250 | -60.00 |
| Quality assurance | 75,000 | 125,000 | 50,000 | 66.67 |
| Documentation | 62,500 | 62,500 | 0 | 0.00 |
| Management | 202,703 | 187,500 | -15,203 | -7.50 |
| Totals | $1,536,406 | $1,223,409 | -$312,997 | -20.37% |
| Cost per FP | $1,536.41 | $1,223.41 | -$313.00 | -20.37% |
| Cost per LOC | $12.29 | $9.79 | -$2.50 | -20.37% |

[a.] FP = function points per IFPUG Counting Practices Manual 4.1.
[b.] LOC = lines of noncommentary code.
[c.] Burdened = basic compensation and overhead costs.
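The variance columns of Table 18.6 are simple differences between the two cost profiles. A short Python sketch, using the activity costs from Tables 18.4 and 18.5 (the total differs from the published figure by a dollar because the per-activity inputs are rounded):

```python
# Activity costs in dollars, from Table 18.4 (level 1) and Table 18.5 (level 3)
level1 = {"Requirements": 68_182, "Design": 166_667, "Design reviews": 21_429,
          "Coding": 441_176, "Code inspections": 30_000, "Testing": 468_750,
          "Quality assurance": 75_000, "Documentation": 62_500,
          "Management": 202_703}
level3 = {"Requirements": 60_000, "Design": 150_000, "Design reviews": 50_000,
          "Coding": 340_909, "Code inspections": 60_000, "Testing": 187_500,
          "Quality assurance": 125_000, "Documentation": 62_500,
          "Management": 187_500}

for activity in level1:
    variance = level3[activity] - level1[activity]   # negative means savings
    pct = 100.0 * variance / level1[activity]
    print(f"{activity:18s} {variance:+9,d}  {pct:+7.2f}%")

total_variance = sum(level3.values()) - sum(level1.values())
print(f"Total: {total_variance:+,d} "
      f"({100.0 * total_variance / sum(level1.values()):+.2f}%)")
```

Running the loop makes the pattern described in the text obvious: testing drops by 60 percent, while the review, inspection, and quality assurance activities all cost more at level 3.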
Table 18.7 illustrates the differences in defect potentials, defect removal efficiency levels, and delivered defects of the two cases. As can be seen, a combination of defect prevention and defect removal can yield significant reductions in delivered defect levels. Because finding and fixing defects is the most costly and time-consuming activity for software, projects that are successful in preventing defects or removing them via inspections will achieve shorter schedules and higher productivity as well as better quality.
If software process improvement is to become a mainstream technology, it is important to demonstrate exactly what is being improved, and by how much.
Activity-based cost analysis illustrates that process improvement does not improve every activity equally. Improvements tend to be very significant for key activities such as testing but scarcely visible for others such as user documentation.
Table 18.7. SEI CMM Level 1 and Level 3 Defect Differences

| Level | Potential Defects [a.] | Removal Efficiency [b.] (%) | Delivered Defects | Defects per Function Point [c.] | Defects per KLOC [d.] |
|---|---|---|---|---|---|
| SEI Level 1 | 6,150 | 85.01 | 922 | 0.92 | 7.38 |
| SEI Level 3 | 3,500 | 95.34 | 163 | 0.16 | 1.30 |

[a.] Defects likely to be encountered from the start of requirements through at least one year of customer use.
[b.] Percentage of potential defects found before delivery of the software to customers.
[c.] Function points per IFPUG Counting Practices Manual 4.1.
[d.] KLOC = one thousand lines of code.
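The delivered-defect figures in Table 18.7 follow directly from defect potential and removal efficiency. A minimal sketch, assuming delivered = potential times (1 minus efficiency), with the per-FP and per-KLOC rates derived from the 1,000 FP and 125 KLOC project size:

```python
def delivered_defects(potential: int, removal_efficiency: float) -> int:
    """Defects reaching customers, given prerelease removal efficiency."""
    return round(potential * (1.0 - removal_efficiency))

SIZE_FP, SIZE_KLOC = 1_000, 125  # project size from Tables 18.4 and 18.5

for level, potential, dre in [("SEI Level 1", 6_150, 0.8501),
                              ("SEI Level 3", 3_500, 0.9534)]:
    delivered = delivered_defects(potential, dre)
    print(f"{level}: {delivered} delivered defects, "
          f"{delivered / SIZE_FP:.2f} per FP, "
          f"{delivered / SIZE_KLOC:.2f} per KLOC")
```

Note that the level 3 gain comes from both factors at once: prevention lowers the potential from 6,150 to 3,500, and stronger reviews and inspections raise the removal efficiency from about 85 to about 95 percent.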
Indeed, for a number of important activities such as design and code inspections and quality assurance work, the costs will be higher for more mature organizations at level 3 on the CMM than for those at level 1 on the CMM.