The following two solutions illustrate approximations to some of the AHP calculations that are performed with Expert Choice. Although these approximate calculations can be somewhat tedious and may not produce the same results, particularly when there are some inconsistencies in the judgments, they give you a more detailed look at some of AHP's intricacies. We will illustrate two approximate methods for arriving at an AHP solution to Case Study 8.1. The approximations can be good in some cases but not so good in others. In general, the smaller the clusters in an AHP model and the more consistent the judgments, the more accurate the approximate calculations will be. Following this approximate solution, we will present another variation: the one developed by Michael Brassard.

Approximate Solution Method 1

Step 1: Brainstorm and Construct a Hierarchical Model of the Problem

Step 1 in the solution derivation is the same as the one used earlier, in the section "Case Study 8.1 Solution Using Expert Choice."

Step 2: Construct a Pairwise Comparison Matrix for the Objectives

To establish the priority ranking among the three alternatives, AHP uses pairwise relative comparisons. A decision-maker stipulates his or her judgments regarding the relative importance of the objectives in attaining the overall goal. The decision-maker's judgments are entered into the elements above the diagonal in Table 8.3 using the numeric representation of Saaty's verbal scale. The diagonal elements of Table 8.3 are all 1 because an element's importance when compared to itself is 1. The elements below the diagonal are the reciprocals of the elements above the diagonal, as per Axiom 2 of AHP: if A is n times as important as B, then B is 1/n times as important as A. A fractional judgment, such as 1/3 when comparing the row element Implementation Cost to the column element Technical Risk, means that the column element is more important than the row element; in this case, moderately more important.
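To make the reciprocal rule concrete, the short Python sketch below fills in a complete comparison matrix from judgments entered above the diagonal. Apart from the 1/3 judgment mentioned in the text, the values are illustrative placeholders, not the actual entries of Table 8.3.

```python
# Build a reciprocal pairwise comparison matrix from upper-triangle judgments.
# Apart from the 1/3 mentioned in the text, the values are placeholders,
# not the actual entries of Table 8.3.

objectives = ["Implementation Cost", "Technical Risk", "Business Risk",
              "Competitive Advantage", "Time to Implement"]
n = len(objectives)

# judgments[(i, j)] = how many times more important objective i is than j (i < j)
judgments = {
    (0, 1): 1/3,   # Implementation Cost vs. Technical Risk (from the text)
    (0, 2): 1/5, (0, 3): 1/7, (0, 4): 1/3,
    (1, 2): 1/3, (1, 3): 1/5, (1, 4): 1.0,
    (2, 3): 1/2, (2, 4): 3.0,
    (3, 4): 5.0,
}

# Diagonal entries are 1; entries below the diagonal are the reciprocals
# of those above it (Axiom 2).
matrix = [[1.0] * n for _ in range(n)]
for (i, j), value in judgments.items():
    matrix[i][j] = value
    matrix[j][i] = 1.0 / value

for name, row in zip(objectives, matrix):
    print(name, [round(x, 3) for x in row])
```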
Because it is not easy to calculate the principal right eigenvector manually, an approximate technique is used in which the priorities are approximated by the normalized sums of the judgments in each of the rows (or columns). For example, in Table 8.3, the pairwise judgments are summed for each row. The sums are then normalized, as shown in the last column of Table 8.3. The "weights" represent an approximation to the relative importance, or priorities, of the objectives. For example, the priority of Implementation Cost using this approximate technique is .031, whereas the exact solution, computed by Expert Choice, is .033. Similarly, the approximate priority for Competitive Advantage is .383, and the exact solution is .402. The more consistent the judgments, the closer the approximate solution is to the exact solution.

Step 3: Construct a Pairwise Comparison Matrix for the Alternatives

This step involves pairwise comparisons of the alternatives with respect to each objective in separate matrices, one for each objective. The procedure is similar to the one used for constructing the pairwise comparison matrix for the objectives. The question to ask here is, which alternative is relatively more important than the other in terms of its contribution toward achieving the particular objective being considered? The five matrices are constructed as shown in Tables 8.4 through 8.8.
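Before moving on, the row-sum approximation from Step 2, which applies equally to the five alternative matrices just described, can be sketched in a few lines of Python. The 3 x 3 judgments below are placeholders, not values from the case study.

```python
# Approximate priorities as normalized row sums of a reciprocal judgment matrix.

def row_sum_priorities(matrix):
    row_sums = [sum(row) for row in matrix]
    grand_total = sum(row_sums)
    return [s / grand_total for s in row_sums]

# Placeholder 3 x 3 reciprocal matrix (not the case-study judgments).
matrix = [
    [1.0, 1/3, 1/5],
    [3.0, 1.0, 1/3],
    [5.0, 3.0, 1.0],
]

print([round(w, 3) for w in row_sum_priorities(matrix)])  # weights sum to 1.0
```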
Step 4: Approximate the Priorities from the Judgments

After the various matrices have been constructed, we need to determine the alternatives' priorities with respect to each of the covering objectives. As mentioned earlier, this involves a mathematical computation of eigenvalues and eigenvectors that is beyond the scope of this book. We will use the following procedure, which is a good approximation of the synthesized priorities:[14]
This three-step procedure is carried out for the pairwise comparison matrix for the alternatives vis-à-vis Implementation Cost (see Table 8.4). It results in the normalized pairwise comparison matrix shown in Table 8.9. The three-step construction of the normalized matrix is shown in Tables 8.9 through 8.11.
The preceding calculations provide the priorities for the three alternatives vis-à-vis the Implementation Cost objective. The priority vector with respect to Implementation Cost is
compared to the exact solution obtained with Expert Choice:
Similarly, the priority vectors for the other objectives are calculated and are as follows:
The exact solution obtained with Expert Choice is
Because the inconsistencies of the judgments are rather small for these matrices, the approximate priorities are close to those obtained with Expert Choice. In fact, they are exactly the same for the two matrices that are perfectly consistent. You will next see how to manually approximate the calculations of the judgment consistencies.

Step 5: Estimate the Consistency Ratio

AHP does not require perfectly consistent judgments, but it provides an index for measuring consistency for each matrix and for the entire hierarchy. Thus, it is possible to find where the inconsistent judgments are. You can change them if you want to, although this is not required. The goal is not to be perfectly consistent, but to be as accurate as possible. A certain amount of inconsistency is required to learn new things.[15] An inconsistency index of .10 or less is usually considered reasonable. An approximate computation of the consistency index and consistency ratio (or what Expert Choice calls the inconsistency ratio, because the higher the value, the more inconsistent the judgments) is as follows:[16]
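As a rough guide, the standard consistency calculation can be scripted as shown below: lambda_max is estimated from the matrix and its priority vector, CI = (lambda_max - n)/(n - 1), and CR is CI divided by Saaty's random index for a matrix of size n. This is a generic sketch that may differ in detail from the procedure cited above; the sample matrix and priorities are placeholders.

```python
# Approximate the consistency ratio (CR) of a pairwise comparison matrix.
# lambda_max is estimated by multiplying the matrix by the priority vector
# and averaging the ratios of the result to the priorities.
# RI values are Saaty's random indices for matrices of size n.

RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def consistency_ratio(matrix, priorities):
    n = len(matrix)
    weighted_sums = [sum(matrix[i][j] * priorities[j] for j in range(n))
                     for i in range(n)]
    lambda_max = sum(ws / p for ws, p in zip(weighted_sums, priorities)) / n
    ci = (lambda_max - n) / (n - 1)     # consistency index
    return ci / RANDOM_INDEX[n]         # consistency ratio

# Placeholder matrix and its approximate priorities (not the case-study values).
matrix = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]
priorities = [0.633, 0.260, 0.106]

print(round(consistency_ratio(matrix, priorities), 3))  # about .03, well under .10
```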
Step 6: Develop the Overall Priority Ranking

Following the preceding computations, the priority vectors for all the objectives are tabulated as shown in Table 8.12.
Table 8.12 reveals the following priorities:
No alternative is most preferred for all the objectives. Therefore, to make a choice, we need to look at the relative importance of all the objectives computed in Table 8.3. The objective priorities from that table appear in Table 8.13.
The overall priority of each alternative is calculated by multiplying each objective's priority by the alternative's priority with respect to that objective and then summing these products, as shown in Table 8.13. We can compute the overall ranking priorities as follows:
This gives us the overall rankings shown in Table 8.14.
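The arithmetic behind Tables 8.13 and 8.14 is a weighted sum: each objective's priority is multiplied by the alternative's priority under that objective, and the products are summed. The Python sketch below illustrates this with hypothetical numbers, not the case-study values.

```python
# Overall priority of each alternative = sum over objectives of
# (objective priority) x (alternative's priority under that objective).
# All numbers below are hypothetical, not the Case Study 8.1 values.

objective_priorities = {"Cost": 0.05, "TechRisk": 0.10, "BusRisk": 0.30,
                        "CompAdv": 0.40, "Time": 0.15}

# alternative_priorities[objective][alternative]
alternative_priorities = {
    "Cost":     {"A": 0.60, "B": 0.25, "C": 0.15},
    "TechRisk": {"A": 0.55, "B": 0.30, "C": 0.15},
    "BusRisk":  {"A": 0.20, "B": 0.50, "C": 0.30},
    "CompAdv":  {"A": 0.10, "B": 0.60, "C": 0.30},
    "Time":     {"A": 0.40, "B": 0.35, "C": 0.25},
}

overall = {
    alt: sum(objective_priorities[obj] * alternative_priorities[obj][alt]
             for obj in objective_priorities)
    for alt in ("A", "B", "C")
}
for alt, score in sorted(overall.items(), key=lambda kv: kv[1], reverse=True):
    print(alt, round(score, 3))
```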
Table 8.14 provides the following rankings:
You can see that, for this simple example, the priorities from the approximate calculations are very close to the Ideal Mode exact calculations produced by Expert Choice. (The Distributive and Ideal synthesis modes are explained in the Expert Choice solution arrived at earlier in the chapter.)

Approximate Solution Method 2: Brassard's Full Analytical Criteria Method for Prioritization

You may recall that this method was introduced in Chapter 7, along with other prioritization matrices, namely the Consensus Criteria Method and the Combination I.D./Matrix Method. Of the three prioritization matrices developed by Brassard, the Full Analytical Criteria Method is the most rigorous and time-consuming. Although it is based on Saaty's AHP methodology, it is not nearly as accurate and can lead to different results. Nevertheless, the following sections describe the steps required to apply Brassard's Full Analytical Criteria Method.[17]

Step 1: Agree on the Ultimate Goal to Be Achieved

We will use the MIS Director's IT Dilemma case for our analysis. This step has to do with clearly stating the ultimate goal of the prioritization process. In this case it is the "overall goal" of Case Study 8.1: "Select the best upgrade path."

Step 2: Create a List of Criteria to Be Applied to the Options Generated

The criteria, or objectives, are Implementation Cost, Technical Risk, Business Risk, Competitive Advantage, and Time to Implement.

Step 3: Judge the Relative Importance of Each Criterion as Compared to Every Other Criterion

Here each criterion is rated against every other criterion in a paired comparison using the following scale:
Table 8.15 shows the pairwise comparisons for the preceding criteria in an L-shaped matrix. A criterion compared to itself receives a score of 1. If it is more important than the criterion it is compared to, it receives a score of 5; if it is less important, a score of 1/5. If it is much more important, it receives a score of 10; if it is much less important, a score of 1/10. The individual comparative scores for each criterion are added, and their proportionate weights are calculated, as shown in the last column of Table 8.15. The weights represent the relative importance, or priorities, of the criteria.
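A brief Python sketch of this scoring follows; the same mechanics carry over to the per-criterion matrices of Step 4. The scores here are made-up placeholders, not the Table 8.15 judgments. Each row total is divided by the grand total to obtain the proportionate weight.

```python
# Brassard-style pairwise scoring of criteria: 1 = equal, 5 = more important,
# 10 = much more important, with 1/5 and 1/10 as the mirror-image scores.
# Row totals divided by the grand total give the proportionate weights.
# The scores below are placeholders, not the Table 8.15 judgments.

criteria = ["Implementation Cost", "Technical Risk", "Business Risk",
            "Competitive Advantage", "Time to Implement"]

scores = [
    [1,    1/5, 1/10, 1/10, 1/5],
    [5,    1,   1/5,  1/5,  1  ],
    [10,   5,   1,    1,    5  ],
    [10,   5,   1,    1,    5  ],
    [5,    1,   1/5,  1/5,  1  ],
]

row_totals = [sum(row) for row in scores]
grand_total = sum(row_totals)
weights = [t / grand_total for t in row_totals]

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
```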
The row weights represent the criteria's relative importance. Of the five criteria, Business Risk and Competitive Advantage are significantly more important than Implementation Cost, Technical Risk, and Time to Implement: scores of .379 and .379 compared to .010, .116, and .116, respectively. Given such dramatic differences, the less-important criteria can be dropped from any further analysis.[18] Therefore, we'll use just Business Risk and Competitive Advantage as the criteria against which the three alternatives (A, B, and C) will be compared to achieve the overall goal of selecting the best upgrade path.

Step 4: Compare All the Alternative Solutions to the Individual Criteria

This step involves pairwise comparison of the alternative solutions with respect to each criterion in separate matrices, one for each criterion. The procedure is similar to the one used to construct the pairwise comparison matrix for the criteria (Table 8.15). The question to ask here is, which alternative solution has a relatively greater impact than the other in terms of its contribution toward achieving the particular criterion being considered? The rows and columns are summed to get the totals, and the relative weights of each alternative solution are calculated as a percentage (fraction) of the total. The two matrices are constructed as shown in Tables 8.16 and 8.17.
Step 5: Compare All the Alternative Solutions Based on All Criteria Combined

This is the summary matrix and is prepared by using an L-shaped matrix (see Table 8.18). All the alternative solutions are listed on the vertical side of the matrix. All the criteria considered previously (see Tables 8.16 and 8.17) are listed on the horizontal side. All the percentage scores for the two criteria are recorded as shown. All these scores are multiplied by the weighted score of each criterion in Table 8.15.
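A minimal Python sketch of this summary computation follows. The criterion weights of .379 come from the text; the per-criterion alternative scores are hypothetical, not the Table 8.16 and 8.17 values.

```python
# Brassard summary matrix: each alternative's score under a criterion is
# multiplied by that criterion's weight (from Table 8.15), and the row
# totals give the overall ranking. The .379 weights come from the text;
# the per-criterion alternative scores below are hypothetical.

criterion_weights = {"BusRisk": 0.379, "CompAdv": 0.379}

# Fractional scores of each alternative under each criterion (each sums to 1).
alternative_scores = {
    "BusRisk": {"A": 0.20, "B": 0.55, "C": 0.25},
    "CompAdv": {"A": 0.10, "B": 0.65, "C": 0.25},
}

row_totals = {
    alt: sum(criterion_weights[c] * alternative_scores[c][alt]
             for c in criterion_weights)
    for alt in ("A", "B", "C")
}
grand_total = sum(row_totals.values())

for alt in sorted(row_totals, key=row_totals.get, reverse=True):
    print(alt, round(row_totals[alt] / grand_total, 3))
```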
The totals of rows and columns are calculated, as are the weights. This provides the following ranking:
The most preferable alternative overall is alternative B. For this simple example, the ranking produced by Brassard's Full Analytical Criteria Method is the same as the rankings from the AHP solutions presented earlier in the chapter. As you can see from Table 8.19, however, alternative B receives a much higher relative priority under Brassard's Full Analytical Criteria Method than under Approximate Solution Method 1 or Expert Choice, which are quite close to each other. As a manual calculation, Method 1 therefore provides a solution much closer to the one obtained from Expert Choice than Brassard's Method does. For accuracy of results, additional analytical features, and sheer convenience of use, we recommend Expert Choice.