Estimating Methods for Projects

There is no single method that applies to all projects. Estimating is very domain specific. Construction, software, pharmaceuticals, packaging, and services, just to name a few of perhaps hundreds if not thousands of domains, have unique and specific estimating methodologies. Our intent is to discuss general principles that apply universally. Project managers are in the best position to adapt generalities to the specific project instance.

Estimating Concepts

The objectives of performing an estimate are twofold: to arrive at an expected value for the item being estimated and to be able to convey a figure of merit for that estimate. In this book we will focus on estimating deliverables on a WBS. The figure of merit we will use is the confidence interval that is calculable from the statistical data of the underlying distribution of the expected value of the estimate.

Most estimating fits into one of four models as illustrated in Figure 3-8:

  • Top-down value judgments from the business side of the project balance sheet conveyed to the project team

  • Similar-to judgments from either side of the project balance sheet, but most often from the business side

  • Bottom-up facts-driven estimates of actual work effort from the project side of the project balance sheet conveyed to the project sponsor

  • Parametric calculations from a cost-history model, developed by the project team, and conveyed to the project sponsor

Figure 3-8: Estimating Concepts.

Naturally, it is rare that a project would depend only on one estimating technique; therefore, it is not unusual that any specific project team will use all the estimating methods available to it that fit the project context. However, let us consider them one by one.

Top-Down Estimates

Top-down estimates could be made by anyone associated with the project, but most often they come from the business and reflect a value judgment based on experience, marketing information, benchmarking data, consulting information, and other extra-project information. Top-down estimates rarely come with the concrete and verifiable facts the project team needs to validate the estimate against the scope laid out on the WBS.

Working with top-down estimates requires the steps shown in Table 3-3. Project risks are greatest in this estimating methodology, and overall cost estimates are usually lowest. In its purest form, top-down estimating forgoes the quantitative cost information that could be developed by the project team. If the project team does develop an independent cost input, its purpose is to provide a comparison with the top-down estimate; such a comparison establishes the range or magnitude of the risk of executing the project for the top-down budget. Because risks are greatest in top-down estimating, they require careful identification, minimization planning, and estimation before the project begins. Risks are quantified and made visible on the project side of the project balance sheet.

Table 3-3: Top-Down Estimates

  • Receive estimates from the business: Estimates from the business reflect a judgment on the investment available given the intended scope and value to the business.

  • Interview business leaders to determine the intended scope: Scope is the common understanding between the business and the project. Interviews provide the opportunity to exchange information vital to project success.

  • Verify assumptions and validate sources, if any: The business judgment on investment and value is based on certain assumptions of the business managers and may also include collateral information that is useful to the project manager.

  • Develop WBS from scope: The WBS must contain all the scope, but only the scope required by the business sponsors.

  • Allocate top-down resources to WBS: Allocation is a means of distributing the investment made possible by the business to the elements of scope on the WBS.

  • Cost account managers identify risks and gaps: Cost account managers have responsibility for elements of the WBS. They must assess the risk of performance based on the allocation of investment to the WBS made by the project manager.

  • Negotiate to minimize risks and gaps: Once the risks are quantified and understood, a confidence estimate can be made of the probability of meeting the project scope for the available investment. Negotiations with the business sponsors narrow the gap between investment and expected cost.

  • Top-down estimate to the business side of the project balance sheet: The business makes the judgment on how much investment to make. This investment goes on the business side of the project balance sheet.

  • Expected value estimate and risks to the project side of the balance sheet: Once the allocation is made to the WBS, there is opportunity for the project manager to develop the risks to performance, the expected value, and the confidence of meeting the sponsor's objectives.

A common application of the top-down methodology is in competitive bidding to win a project opportunity as a contractor to the project sponsor. In this scenario, the top-down estimate usually comes from a marketing or sales assessment of what the market will bear, what the competition is bidding, and in effect "what it will take to win." If the top-down estimate is the figure offered to the customer as the price, then the project manager is left with the task of estimating the risk of performance and developing the risk management plan to contain performance cost within the offered price:

Top-down offer to do business = Independently estimated cost of performance offered + Risk to close gap with top-down offer

From the steps in Table 3-3, we see that the project manager must allocate the top-down budget to the WBS. Doing so involves the following quantitative steps:

  • Develop the WBS from the scope statement, disregarding cost.

  • By judgment, experience, prototyping, or other means, identify the deliverable with the lowest likely cost, or the deliverable that represents the smallest "standard unit of work." Give that deliverable a numerical cost weight of 1 and call it the "baseline" deliverable, B. This procedure normalizes all costs in the WBS to the cost of the least costly deliverable.

  • Estimate the normalized cost of all other deliverables, D, as multiples, M, of the baseline deliverable: Di = Mi * B, where M is a random variable with unknown distribution and "i" has values from 1 to n. "n" is the number of deliverables in the WBS.

  • Sum all deliverable weights and divide the sum into the available funds to determine an absolute baseline cost of the least costly deliverable.

    ($ Top-down budget)/(Σ Mi) = Allocated cost of baseline deliverable B

  • A "sanity check" on the cost of B is now needed. Independently estimate the cost of B to determine an offset, plus or minus, between the allocated cost and the independent estimate. This offset, O, is a bias applied to each deliverable in the WBS in proportion to its weight, as Mi * O. The total bias in the WBS is given by:

    Total cost bias in WBS = (Σ Mi) * O

  • Complete the allocation of all top-down budgets to the deliverables in the WBS according to their weights.

  • It is helpful at this point to simplify the disparate deliverables on the WBS to an average deliverable and its statistics; a short calculation sketch follows this list. We know from the Central Limit Theorem that the average deliverable will be approximately Normal distributed, so the attributes of the Normal distribution will be helpful to the project manager:

    Average deliverable cost = αd = (1/n) * Σ[Di + (Mi * O)]

    σ² of average deliverable = (1/n) * Σ[(Di + Mi * O) - αd]², and

    σ of average deliverable = √{(1/n) * Σ[(Di + Mi * O) - αd]²}
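A minimal Python sketch of the allocation arithmetic, using the weights, budget, and baseline offset that appear in Table 3-4; it simply automates the formulas above and is illustrative only.

```python
from math import sqrt

# Inputs taken from the Table 3-4 example: deliverable weights Mi,
# the top-down budget, and the baseline offset O.
weights = [8, 5, 1, 2, 2.5, 1.5, 3]   # Mi for each WBS deliverable
budget = 30_000.0                      # $ top-down budget from the business
offset = 300.0                         # O, bias per baseline unit of work

# Allocated cost of the baseline deliverable: B = budget / sum of weights
baseline_cost = budget / sum(weights)

# Allocate the budget to each deliverable, then apply the weighted bias Mi * O
allocated = [m * baseline_cost for m in weights]
biased = [d + m * offset for d, m in zip(allocated, weights)]

# Average deliverable and its statistics, per the formulas above
n = len(weights)
avg = sum(biased) / n
variance = sum((x - avg) ** 2 for x in biased) / n
sigma = sqrt(variance)

print(f"Baseline B = ${baseline_cost:,.2f}")                  # about $1,304
print(f"Total bias = ${sum(weights) * offset:,.0f}")          # $6,900
print(f"Average deliverable = ${avg:,.0f}, sigma = ${sigma:,.0f}")   # about $5,271 and $3,635
```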

It is easier to calculate these figures than the formulas might suggest. Table 3-4 provides a numerical example. In this example, a $30,000 top-down budget is applied to a WBS of seven deliverables. The offset is estimated at 23% of the allocated baseline cost. Immediately, it appears that there may be a $6,900 risk to manage (23% of $30,000). However, the calculations employing the Normal distribution show that the confidence of hitting the top-down budget is only 24%, and with the $6,900 risk included, the confidence increases to only 50%. At 68% confidence, the level many firms require for fixed-price bidding, the figure rises to $46,516. Clearly, if the risk is to be reduced, the scope will have to be downsized, the budget increased, or more time given to estimating the offsets before this project goes forward.

Table 3-4: Top-Down Allocation to WBS

WBS Element | Weight, Mi | Allocated Budget, Di | Allocated Budget + (Mi * O) | Distance² = (d - average d)²
1.1.1 | 8 | $10,435 | $12,835 | 57,204,324
1.1.2 | 5 | $6,522 | $8,022 | 7,564,208
1.1.3 | 1 | $1,304 | $1,604 | 13,447,481
1.2.1 | 2 | $2,609 | $3,209 | 4,254,867
1.2.2 | 2.5 | $3,260 | $4,010 | 1,591,202
1.3.1 | 1.5 | $1,957 | $2,407 | 8,207,691
1.3.2 | 3 | $3,913 | $4,813 | 210,117
Totals | 23 | $30,000 | $36,900 | 92,479,890

Given: Top-down budget = $30,000

Evaluated least costly baseline deliverable, B = $1,304.35

Estimated independent cost of B = $1,604.35

Calculated baseline offset, O = $1,604.35 - $1,304.35 = $300

n = 7

Σ Mi = 23

Σ Di = Σ(Mi * B) = $30,000

Average deliverable, average d, with offset = $36,900/7 = $5,271

Variance, σ² = 92,479,890/7 = 13,211,413

Standard deviation, σ = $3,635

Confidence calculations:

Total standard deviation of WBS = √(7 * σ²) = √92,479,890 = $9,616

24% confidence: WBS total $30,000 [*]

50% confidence: WBS total $36,900

68% confidence: WBS total $36,900 + $9,616 = $46,516

[*]From lookup on a single-tail standard Normal table for probability of outcome = ($36,900 - $30,000)/$9,616 = 0.71σ below the mean value. Assumes the summation of the WBS is approximately Normal with mean = $36,900 and σ = $9,616.
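The confidence figures can be checked with a few lines of Python; this sketch assumes, as the footnote does, that the WBS total is approximately Normal.

```python
from statistics import NormalDist

total_mean = 36_900      # expected WBS total with bias, from Table 3-4
total_sigma = 9_616      # standard deviation of the WBS total
budget = 30_000          # top-down budget

# Probability of completing at or under the top-down budget
conf_budget = NormalDist(total_mean, total_sigma).cdf(budget)
print(f"Confidence at ${budget:,}: {conf_budget:.0%}")          # about 24%

# The mean itself is the 50% point; one sigma above is roughly the 68% point
print(f"50% confidence: ${total_mean:,}")
print(f"68% confidence: ${total_mean + total_sigma:,}")          # $46,516
```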

Once the risks are calculated, all the computed figures can be moved to the right side of the project balance sheet. Let us recap what we have so far. On the business side of the project balance sheet, we have the top-down budget from the project sponsors. This is a value judgment about the amount of investment that can be afforded for the deliverables desired. On the right side of the balance sheet, the project manager has the following variables:

  • The estimated "fixed" bias between the cost to perform and the available budget. In the example, the bias is $6,900.

  • The expected WBS total for this project and its standard deviation. In this example, the expected total is $36,900 (equal to the budget + bias) and the standard deviation is $9,616.

  • And, of course, the available budget, $30,000.

As was done in the example, confidences are calculated and the overall confidence of the project is negotiated with the project sponsor until the project risks are within the risk tolerance of the business.

Similar-To Estimates

"Similar-to" estimates have many of the features of the top-down estimate except that there is a model or previous project with similar characteristics and a cost history to guide estimating. However, the starting point is the same. The business declares the new project "similar to" another completed project and provides the budget to the new project team based on the cost history of the completed project. Of course, some adjustments are often needed to correct for the escalation of labor and material costs from an earlier time frame to the present, and there may be a need to adjust for scope difference. In most cases, the "similar-to" estimate is very much like a top-down estimate except that there is usually cost history at the WBS deliverable level available to the project manager that can be used by the project estimating team to narrow the offsets. In this manner, the offsets are not uniformly proportional as they were in the top-down model, but rather they are adjusted for each deliverable to the extent that relevant cost history is available.

The quantitative methods applied to the WBS are not really any different from those employed in the top-down case, except for the individual treatment of the offsets. Table 3-5 provides an example. We assume cost history can improve the offset estimates (or provide the business with a more realistic figure to start with). If so, the confidence in a budget developed by the business as a "similar to" is generally much higher.

Table 3-5: Similar-To Estimates

WBS Element | Allocated Budget from Cost History, Di | Offset | Allocated Budget + Offset | Distance² = (d - average d)²
1.1.1 | $10,435 | $200 | $10,635 | 39,813,357
1.1.2 | $6,522 | -$100 | $6,422 | 4,396,315
1.1.3 | $1,304 | $300 | $1,604 | 7,401,948
1.2.1 | $2,608 | $50 | $2,658 | 2,778,889
1.2.2 | $3,261 | -$75 | $3,186 | 1,297,618
1.3.1 | $1,957 | $100 | $2,057 | 5,145,994
1.3.2 | $3,913 | -$200 | $3,713 | 374,491
Totals | $30,000 | $275 | $30,275 | 61,208,612

Given: Top-down budget = $30,000

Evaluated least costly baseline deliverable, B = $1,304

n = 7

Σ Mi = 23

Σ Di = Σ(Mi * B) = $30,000

Average deliverable, average d, with offset = $30,275/7 = $4,325

Variance, σ² = (1/7) * 61,208,612 = 8,744,087

Standard deviation, σ = $2,957

Confidence calculations:

Total standard deviation of WBS = √61,208,612 = $7,823

49% confidence: WBS total $30,000 [*]

50% confidence: WBS total $30,275

68% confidence: WBS total $30,275 + $7,823 = $38,098

[*]From lookup on a single-tail standard Normal table for probability of outcome = ($30,275 - $30,000)/$7,823 = 0.04σ below the mean value. Assumes the summation of the WBS is approximately Normal with mean = $30,275 and σ = $7,823.
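A companion sketch for the similar-to case, rolling up the individually adjusted offsets of Table 3-5 under the same Normal assumption:

```python
from math import sqrt
from statistics import NormalDist

# Allocated budgets from cost history and per-deliverable offsets (Table 3-5)
history = [10_435, 6_522, 1_304, 2_608, 3_261, 1_957, 3_913]
offsets = [200, -100, 300, 50, -75, 100, -200]

adjusted = [d + o for d, o in zip(history, offsets)]
total = sum(adjusted)                                        # $30,275
avg = total / len(adjusted)
sigma_total = sqrt(sum((x - avg) ** 2 for x in adjusted))    # about $7,823

conf = NormalDist(total, sigma_total).cdf(30_000)
print(f"Expected WBS total: ${total:,}, sigma: ${sigma_total:,.0f}")
print(f"Confidence at $30,000: {conf:.0%}")                  # roughly 49%
```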

Bottom-Up Estimating

So far we have seen that the project side of the balance sheet usually carries a higher estimate than the figure given by the business. Although there is no business rule or project management practice that makes this so in every case, it does happen more often than not. That trend toward a higher project estimate continues in bottom-up estimating.

Bottom-up estimating, in its purest form, is an independent estimate by the project management team of the activities in the WBS. The estimating team may actually be several teams working in parallel on the same estimating problem. Such an arrangement is called the Delphi method. The Delphi method is an approach to bottom-up estimating whereby independent teams evaluate the same data, each team comes to an estimate, and then the project manager synthesizes a final estimate from the inputs from all teams.

The starting point for the estimating team(s) is the scope statement provided by the business. A budget from the business is provided as information and guidance. Parametric data developed from cost history are assumed to be unavailable. In practice, parametric data in some form are usually available, but we will discuss parametric data next.

Best practice in bottom-up estimating employs the "n-point" estimate rather than a single deterministic number. The number of points is commonly taken to be three: most likely, most pessimistic, and most optimistic (thus the expression "three-point estimates"). A distribution must be selected to go with the three-point estimate. The Normal, BETA, and Triangular are the distributions of choice for project managers. The BETA and Triangular are used for individual activities and deliverables; the Normal arises from the interaction of many BETA or Triangular distributions in the same WBS. However, if a deliverable has symmetrical optimistic and pessimistic values, then the Normal is used for that deliverable as well.

Table 3-6 provides a numerical example of bottom-up estimating using the BETA distribution. Recall that the Triangular distribution will give more pessimistic statistics than the BETA. Although individual deliverables are estimated with fairly wide optimistic and pessimistic swings, the overall WBS estimate carries higher confidence at a lower total than the top-down result.

Table 3-6: Bottom-Up Estimates

WBS Element | Most Likely Estimate | Most Pessimistic Offset | Most Optimistic Offset | BETA Expected Value | BETA Variance
1.1.1 | $11,000 | $3,000 | -$1,000 | $11,333 | 444,444
1.1.2 | $6,800 | $4,000 | -$700 | $7,350 | 613,611
1.1.3 | $1,500 | $800 | -$300 | $1,583 | 33,611
1.2.1 | $3,000 | $2,000 | -$500 | $3,250 | 173,611
1.2.2 | $3,100 | $1,800 | -$750 | $3,275 | 180,625
1.3.1 | $1,800 | $800 | -$300 | $1,883 | 33,611
1.3.2 | $3,700 | $1,900 | -$600 | $3,917 | 173,611
Totals |  |  |  | $32,591 | 1,653,124

Business desires project outcome = $30,000

Average deliverable from BETA = $32,591/7 = $4,656

Variance, σ² = 1,653,124/7 = 236,161

Standard deviation, σ = $486

Confidence calculations:

Total standard deviation of WBS = √1,653,124 = $1,286

50% confidence: WBS total $32,591 [*]

68% confidence: WBS total $32,591 + $1,286 = $33,877

[*]Assumes approximately Normal distribution of the WBS summation with mean = $32,591 and σ = $1,286.
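The BETA figures in Table 3-6 are consistent with the common PERT approximations, expected value = (optimistic + 4 * most likely + pessimistic)/6 and variance = ((pessimistic - optimistic)/6)². The short Python sketch below applies those approximations to the three-point estimates in the table.

```python
from math import sqrt

# Three-point estimates from Table 3-6:
# (most likely, pessimistic offset, optimistic offset)
estimates = [
    (11_000, 3_000, -1_000),
    (6_800, 4_000, -700),
    (1_500, 800, -300),
    (3_000, 2_000, -500),
    (3_100, 1_800, -750),
    (1_800, 800, -300),
    (3_700, 1_900, -600),
]

total_ev = 0.0
total_var = 0.0
for most_likely, pess_offset, opt_offset in estimates:
    pessimistic = most_likely + pess_offset
    optimistic = most_likely + opt_offset
    ev = (optimistic + 4 * most_likely + pessimistic) / 6    # PERT expected value
    var = ((pessimistic - optimistic) / 6) ** 2              # PERT variance
    total_ev += ev
    total_var += var

print(f"Expected WBS total: ${total_ev:,.0f}")   # close to the $32,591 in Table 3-6
print(f"WBS sigma: ${sqrt(total_var):,.0f}")     # about $1,286
print(f"68% confidence: ${total_ev + sqrt(total_var):,.0f}")   # about $33,877
```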

Parametric Estimating

Parametric estimating is also called model estimating. Parametric estimating depends on cost history and an estimate of similarity between the project history available to the model and the project being estimated. Parametric estimating is employed widely in many industries, and industry-specific models are well published and supported by the experiences of many practitioners. [12] The software industry is a case in point, with several models in wide use. The hardware, construction, environmental, and pharmaceutical industries, among many others, also have good models in place. The general characteristics of some of these models are given in Table 3-7.

Table 3-7: Parametric Estimating Models

  • Construction: PACES 2001
    Key model parameters and calibration factors: covers new construction, renovation, and alteration; covers buildings, site work, and area work; regression model based on cost history in military construction; input parameters (abridged list): size, building type, foundation type, exterior closure type, roofing type, number of floors, functional and utility space requirements.
    Model outcome: specific cost estimates (not averages) of specified construction according to the model; project costs; life cycle costs.

  • Environmental: RACER
    Key model parameters and calibration factors: media/waste type, cleanup facilities and methods; handles natural attenuation, free product removal, passive water treatment, minor field installation, O&M, and phytoremediation; technical enhancements to over 20 technologies; ability to use either system costs or user-defined costs; professional labor template that creates a task percentage template.
    Model outcome: programming and budgetary estimates for remedial environmental projects.

  • Hardware: Price H
    Key model parameters and calibration factors: key parameters are weight, size, and manufacturing complexity; input parameters: quantities of equipment to be developed, design inventory in existence, operating environment and hardware specifications, production schedules, manufacturing processes, labor attributes, financial accounting attributes.
    Model outcome: cost estimates; other parameter reports.

  • Hardware: SEER H
    Key model parameters and calibration factors: WBS oriented; six knowledge bases support the WBS elements: application, platform, optional description, acquisition category, standards, class; cost estimates are produced for development and production cost activities (18) and labor categories (14), as well as "other" categories (4).
    Model outcome: production cost estimates, schedules, and risks associated with hardware development and acquisition.

  • Hardware: NAFCOM (NASA Air Force Cost Model)
    Key model parameters and calibration factors: available to qualified government contractors and agencies; WBS oriented; subsystem oriented within the WBS; labor rate inputs, overhead, and G&A costs; integration point inputs; test hardware and quantity; production rates; complexity factors; test throughput factors; integrates with some commercial estimating models.
    Model outcome: estimates design, development, test, and evaluation (DDT&E), flight unit, production, and total (DDT&E + production) costs.

  • Software: COCOMO 81 (waterfall methodology)
    Key model parameters and calibration factors: development environment: detached, embedded, organic; model complexity: basic, intermediate, detailed; parameters used to calibrate outcome (abridged list): estimated delivered source lines of code, product attributes, computer attributes, personnel attributes, project attributes (with the breakdown of attributes, about 63 parameters altogether).
    Model outcome: effort and duration in staff hours or months; other parametric reports.

  • Software: COCOMO II (object oriented)
    Key model parameters and calibration factors: development stages: applications composition, early design, post architecture (modified COCOMO 81); parameters used to calibrate outcome (abridged list): estimated source lines of code, function points, COCOMO 81 parameters (with some modification), productivity rating (Stage 1).
    Model outcome: effort and duration in staff hours or months; other parametric reports.

  • Software: Price S
    Key model parameters and calibration factors: nine categories for attributes: project magnitude, program application, productivity factor, design inventory, utilization, customer specification and reliability, development environment, difficulty, and development process.
    Model outcome: effort and duration in staff hours or months; other parametric reports.

  • Software: SEER-SEM
    Key model parameters and calibration factors: three categories for attributes: size, knowledge base, input; input is further subdivided into 15 parameter types very similar to the other models discussed.
    Model outcome: effort and duration in staff hours or months; other parametric reports.

Most parametric models are "regression models." We will discuss regression analysis in Chapter 8. Regression models require data sets from past performance in order that a regression formula can be derived. The regression formula is used to predict or forecast future performance. Thus, to employ parametric models they first must be calibrated with cost history. Calibration requires some standardization of the definition of deliverable items and item attributes. A checklist specific to the model or to the technology or process being modeled is a good device for obtaining consistent and complete history records. For instance, to use a software model, the definition of a line of code is needed, and the attributes of complexity or difficulty require definitions. In publications, the page size and composition require definition, as well as the type of original material that is to be received and published. Typically, more than ten projects are needed to obtain good calibration, but the requirements of cost history are model specific.
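As a minimal illustration of the calibration idea, the sketch below fits a one-parameter cost-per-line regression to a set of hypothetical historical projects; real parametric models such as those in Table 3-7 use many more parameters and calibration factors.

```python
from statistics import linear_regression  # Python 3.10+

# Hypothetical cost history: (delivered source lines of code, actual cost in $)
history = [
    (4_000, 98_000),
    (6_500, 170_000),
    (9_000, 225_000),
    (12_000, 310_000),
    (15_500, 402_000),
    (20_000, 515_000),
    (25_000, 640_000),
    (31_000, 800_000),
    (38_000, 985_000),
    (46_000, 1_190_000),
]

sloc = [s for s, _ in history]
cost = [c for _, c in history]

# Calibrate: fit cost = slope * sloc + intercept from the history
slope, intercept = linear_regression(sloc, cost)

# Forecast the cost of a new deliverable estimated at 18,000 lines of code
forecast = slope * 18_000 + intercept
print(f"Calibrated rate: ${slope:.2f} per line, fixed base ${intercept:,.0f}")
print(f"Forecast for 18,000 SLOC: ${forecast:,.0f}")
```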

Once a calibrated model is in hand, to obtain estimates of deliverable costs the model is fed with parameter data of the project being estimated. Model parameters are also set or adjusted to account for similarity or dissimilarity between the project being estimated and the project history. Parameter data could be the estimated number of lines of software code to be written and their appropriate attributes, such as degree of difficulty or complexity. Usually, a methodology is incorporated into the model. That is to say, if the methodology for developing software involves requirements development, prototyping, code and unit test, and system tests, then the model takes this methodology into account. Some models also allow for specification of risk factors as well as the severity of those risks.

Outcomes of the model are applied directly to the deliverables on the WBS. At this point, the outcomes are no different from bottom-up estimates. Ordinarily, these outcomes are expected values, since the model will have taken the risk factors and methodology into account to arrive at a statistically useful result. The model may or may not provide other statistical information, such as the variance, standard deviation, or information about any distributions employed. If only the expected value is provided, the project manager must decide whether an independent evaluation is needed to develop the statistics required for confidence intervals. The model outcome may also specify or identify dependencies accounted for in the result; as we saw in the discussion of covariance, dependencies change the risk factors.

Table 3-8 provides a numerical example of parametric estimating practices in the WBS.

Table 3-8: Parametric Estimating

WBS Element | Deliverable | Units | Quantity | Parametric Cost per Unit | Model Expected Value | Model Standard Deviation, σ | Calculated Variance, σ²
1.1.1 | Software code | Lines of code | 5,000 | $25 | $125,000 | $25,000 | 625,000,000
1.1.2 | Software test plans | Pages | 500 | $400 | $200,000 | $10,000 | 100,000,000
1.1.3 | Software requirements | Numbered items | 800 | $100 | $80,000 | $12,000 | 144,000,000
1.2.1 | Tested module | Unit tests | 2,000 | $100 | $200,000 | $30,000 | 900,000,000
1.2.2 | Integrated module | Integration points | 1,800 | $50 | $90,000 | $3,500 | 12,250,000
1.3.1 | Training manuals | Pages | 800 | $400 | $320,000 | $4,000 | 16,000,000
1.3.2 | Training delivery | Students | 900 | $500 | $450,000 | $5,000 | 25,000,000
Totals |  |  |  |  | $1,465,000 |  | 1,822,250,000

Average deliverable from model = $1,465,000/7 = $209,286

Variance, σ² = 1,822,250,000/7 = 260,321,429

Standard deviation, σ = √260,321,429 = $16,134

Confidence calculations:

Standard deviation of total expected value = √1,822,250,000 = $42,687

50% confidence: WBS total $1,465,000 [*]

68% confidence: WBS total $1,465,000 + $42,687 = $1,507,687

[*]Assumes approximately Normal distribution of the WBS summation with mean = $1,465,000 and σ = $42,687.
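A short sketch of the roll-up arithmetic behind Table 3-8; it assumes the model outcomes for the deliverables are independent, so their variances add.

```python
from math import sqrt

# Model expected values and standard deviations per deliverable (Table 3-8)
deliverables = [
    (125_000, 25_000),
    (200_000, 10_000),
    (80_000, 12_000),
    (200_000, 30_000),
    (90_000, 3_500),
    (320_000, 4_000),
    (450_000, 5_000),
]

total_ev = sum(ev for ev, _ in deliverables)                  # $1,465,000
total_sigma = sqrt(sum(sd ** 2 for _, sd in deliverables))    # roughly $42,687

print(f"50% confidence: ${total_ev:,}")
print(f"68% confidence: ${total_ev + total_sigma:,.0f}")      # roughly $1,507,687
```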

[12]A current listing of some of the prominent sources of information about parametric estimating can be found in "Appendix E, Listing of WEB Sites for Professional Societies, Educational Institutions, and Supplementary Information," of the Joint Industry/Government "Parametric Estimating Handbook," Second Edition, 1999, sponsored by the Department of Defense. Among the listings found in Appendix E are those for the American Society of Professional Estimators, International Society of Parametric Analysis, and the Society of Cost Estimating and Analysis.