# Summary of Important Points

Table 2-8 provides the highlights of this chapter.

Table 2-8: Summary of Important Points

Each point of discussion below is paired with a summary of the ideas presented.

• Uncertainty is present in every project.

• Risk management is the process, but probability and statistics provide the mathematical underpinning for the quantitative analysis of project risk.

**Probability and projects**

• The outcome of any single event, whether it is a coin toss or a project, cannot be known with certainty, but the pattern of behavior of a random event can be forecast.

• Single-point deterministic estimates forgo the opportunity to estimate and forecast the impact of opportunity and threat.

• The relative-frequency use of probability, such as "one chance in ten," forecasts a specific outcome; the subjective use of probability, such as "there is a 20% chance of rain today," expresses a degree of confidence that an event will fall within a range.

• Project events that are not mutually exclusive, or that impose conditions that destroy independence between them, generally reduce the probability that either event will happen as planned.

• The probability of any single event must fall in the range zero to one, such that the equation p + (1 - p) = 1 holds at all times.
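The bullets above can be sketched in a few lines: when two independent project events must both happen as planned, the joint probability is the product of the individual probabilities, lower than either alone, and the complement rule always holds. The milestone names and 0.9 figures are illustrative assumptions, not from the text.

```python
# Illustrative probabilities that two independent project milestones
# are each met as planned (hypothetical figures).
p_design_on_time = 0.9
p_build_on_time = 0.9

# Probability that BOTH milestones are met (independent events):
# the product is lower than either probability alone.
p_both = p_design_on_time * p_build_on_time

# Complement rule: p + (1 - p) = 1 for any single event.
p_at_least_one_miss = 1 - p_both
```

Here `p_both` comes out near 0.81, illustrating how chaining events that must all succeed reduces the probability of the plan as a whole.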

**Random variables**

• If the outcome of a specific event has a numerical value, then the outcome is a variable; because its value is not known with certainty beforehand, and can take on more than one value from event to event, it is a random variable.

• Random variables can be discrete or continuous over a range.

• The probability function, or probability distribution, gives the mathematical relationship between a random variable's value and the probability of obtaining that value. Summing (for a discrete variable) or integrating (for a continuous variable) the probability function over the range gives the cumulative probability function, which ranges over 0 to 1.
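As a concrete sketch of the relationship between a discrete probability function and its cumulative function, consider a hypothetical work package with three possible cost outcomes (the values and probabilities are illustrative assumptions):

```python
from itertools import accumulate

# Hypothetical discrete cost outcomes for a work package ($K) and the
# probability of each -- the probability function of the random variable.
outcomes = [100, 120, 150]
probabilities = [0.25, 0.50, 0.25]

# Running sum of the probability function gives the cumulative
# probability function, which must end at 1.0.
cdf = list(accumulate(probabilities))
```

The resulting `cdf` is `[0.25, 0.75, 1.0]`: the probability that cost is at most 100 is 0.25, at most 120 is 0.75, and at most 150 is 1.0.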

**Probability distributions for projects**

• There are four probability distributions commonly used in projects: Normal, BETA, Triangular, and Uniform. Other distributions are helpful in estimating statistical parameters relevant to projects, but are usually left to statisticians.

• BETA and Triangular are often employed at the work package level in the WBS. The Normal distribution is often employed in summary results.

**Useful statistics for project managers**

• Statistics are data; data do not have to be analyzed to be statistics.

• Statistical methods are by and large methods of approximation and estimation.

• Expected value is the most important statistic for project managers. It is the single best estimator of the true mean of a random variable.

• Other useful statistics include: arithmetic average, sample average, variance, standard deviation, mean, mode, and median.
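The statistics listed above are all one-liners in Python's `statistics` module; a sketch over a hypothetical sample of task-duration observations (the numbers are illustrative):

```python
import statistics

# Hypothetical observed task durations (days).
sample = [8, 9, 9, 10, 11, 12, 15]

avg = statistics.mean(sample)      # arithmetic (sample) average
med = statistics.median(sample)    # middle value of the ordered sample
mod = statistics.mode(sample)      # most frequently occurring value
var = statistics.variance(sample)  # sample variance (n - 1 divisor)
sd = statistics.stdev(sample)      # sample standard deviation
```

For this sample the median is 10, the mode is 9, and the average is a bit above both, as the long right tail (the 15-day observation) pulls it up.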

**Statistics of distributions**

• For the same range of values of a random variable, the expected value becomes more pessimistic moving from BETA to Triangular to Normal.

• Approximations to all the major statistics are available in simple arithmetic form for the most used distributions in projects.
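The "simple arithmetic form" approximations mentioned above are commonly the PERT-style formulas built from a three-point estimate; a sketch, with illustrative optimistic/most likely/pessimistic figures:

```python
# Three-point estimate: optimistic (a), most likely (m), pessimistic (b).
# The 10/12/20 figures are illustrative assumptions.
a, m, b = 10.0, 12.0, 20.0

# Common arithmetic approximations for the BETA distribution's statistics.
beta_mean = (a + 4 * m + b) / 6   # approximate BETA expected value
beta_stdev = (b - a) / 6          # approximate BETA standard deviation

# The Triangular mean, by contrast, is exactly the average of the three.
tri_mean = (a + m + b) / 3
```

Note that for the same three points the Triangular mean (14.0) comes out higher, i.e. more pessimistic for a right-skewed estimate, than the BETA mean (13.0), consistent with the ordering noted above.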

**The Law of Large Numbers and the Central Limit Theorem**

• Regardless of the distribution of a random variable, the sample average of n observations of the random variable's outcome will itself be a random variable with a Normal distribution, with mean equal to the mean of the population distribution and variance equal to 1/n of the population variance.

• The probability that an event outcome will have a value different from the mean falls off as 1/y², where y is the distance from the mean.
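The Central Limit Theorem claim above can be checked by simulation: draw sample averages from a decidedly non-Normal population and watch their mean and variance land where the theorem predicts. The sample size and trial count below are illustrative choices.

```python
import random
import statistics

random.seed(7)  # reproducible illustration

# Population: Uniform on [0, 10] -- not Normal at all.
pop_mean = 5.0
pop_var = (10 - 0) ** 2 / 12  # variance of a Uniform(0, 10)

n = 30          # observations averaged per sample (illustrative)
trials = 2000   # number of sample averages collected (illustrative)

sample_means = [
    statistics.mean(random.uniform(0, 10) for _ in range(n))
    for _ in range(trials)
]

# The sample averages cluster near the population mean, with variance
# close to pop_var / n, as the Central Limit Theorem predicts.
mean_of_means = statistics.mean(sample_means)
var_of_means = statistics.variance(sample_means)
```

With these settings, `mean_of_means` lands very near 5.0 and `var_of_means` near `pop_var / 30`, even though no individual observation is Normally distributed.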

**Confidence limits**

• Confidence in a statistical sense means "with what probability the outcome will be within a range of values."

• Estimating confidence extends the project-forecasting problem from estimating the probability of one specific outcome value to forecasting an outcome within certain limits of value.
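As a sketch of confidence limits in this sense, suppose a project cost rollup is summarized as a Normal distribution; the $500K mean and $50K standard deviation below are illustrative assumptions, not figures from the text.

```python
from statistics import NormalDist

# Hypothetical project-cost forecast summarized as Normal ($K).
cost = NormalDist(mu=500.0, sigma=50.0)

# 90% confidence limits: the outcome falls between the 5th and 95th
# percentiles with probability 0.90.
low = cost.inv_cdf(0.05)
high = cost.inv_cdf(0.95)

coverage = cost.cdf(high) - cost.cdf(low)  # probability inside the limits
```

Instead of quoting a single number, the forecast becomes a statement like "with 90% confidence, cost falls between `low` and `high`," which is exactly the range-of-values framing described above.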

**Covariance and correlation**

• Oftentimes, the outcome of one random variable is dependent on another. The degree of dependence is described by the correlation of one on the other. The covariance, equal to the correlation times the product of the two standard deviations, adds to or subtracts from the sum of the variances when the random variables are summed.
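The covariance relationship above reduces to one line of arithmetic: for two random variables X and Y, var(X + Y) = var(X) + var(Y) + 2·cov(X, Y), with cov(X, Y) = correlation × stdev(X) × stdev(Y). The standard deviations and correlation below are illustrative assumptions.

```python
# Illustrative standard deviations of two dependent cost elements ($K)
# and the correlation between them (hypothetical figures).
sd_x, sd_y = 4.0, 3.0
correlation = 0.5  # positive dependence adds to the summed variance

# Covariance: correlation times the product of the standard deviations.
cov_xy = correlation * sd_x * sd_y

# Variance of the sum: the two variances plus twice the covariance.
var_sum = sd_x ** 2 + sd_y ** 2 + 2 * cov_xy
```

With a negative correlation the covariance term would subtract instead, reducing the variance of the total below the simple sum of the parts.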