There are many ways to arrange meaningful metrics. This section shows some practical samples that you can copy and adapt for your own use. For each report, you will see the goal, the suggested frequency, a sample, and comments.
Reports are listed roughly in the order in which they appear in the last section ("More is not Better"). Not all reports are shown, however; only the more interesting and complex ones are included.

Lead Generation Analysis

Goal: Track the outcome of various lead-generating initiatives so you can make decisions about which actions to take again and which to de-emphasize.
Frequency: Weekly (and monthly, quarterly), depending on the size of the organization.
Sample: Table 10.2. Sample Lead Generation Analysis Metrics
Comments: It's best to show actual costs and revenue figures, but if that's difficult to do within the confines of the system, you can simply report on leads generated. Of course, having the financial information available right there is much better. This productivity report is very similar to productivity reports for other functions.

Telesales Productivity

Goal: Track volumes against targets, per person and overall.
Frequency: Typically daily (and weekly, monthly, quarterly).
Sample: Table 10.3. Sample Telesales Productivity Metrics
Comments: Note the combination of raw numbers and target achievement in the "leads" column. This is a simple example of how to make metrics richer and more meaningful. This productivity report is very similar to productivity reports for other functions.

Sales Pipeline

Goal: Track the progression of support deals through the sales cycle, by sales rep and overall.
Frequency: Typically weekly, but could be daily at quarter end and other sensitive periods.
Sample: Table 10.4. Sample Sales Pipeline Metrics
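To make the stage weighting concrete, here is a minimal sketch of how a forecast like the one in this sample might be computed. The stage names, percentage weights, rep name, and deal amounts are illustrative assumptions, not figures from the book; adjust the weights to match your own environment.

```python
# Hypothetical stage weights: early stages count for nothing, later
# stages count partially, and closed deals count in full.
STAGE_WEIGHTS = {
    "prospect": 0.0,
    "needs analysis": 0.0,
    "proposal": 0.25,
    "committed": 0.75,
    "closed": 1.0,
}

def weighted_forecast(pipeline):
    """Sum deal amounts scaled by the probability weight of their stage."""
    return sum(amount * STAGE_WEIGHTS[stage] for stage, amount in pipeline)

# Example pipeline for one (made-up) rep: (stage, deal amount in dollars).
anna = [("prospect", 50000), ("proposal", 40000),
        ("committed", 20000), ("closed", 10000)]
forecast = weighted_forecast(anna)  # 40000*0.25 + 20000*0.75 + 10000 = 35000.0
```

The same function works per rep or for the whole team; summing the raw amounts instead would overstate the forecast by the unweighted early-stage deals.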
Comments: This report computes the forecast based on a weighted formula rather than a straight addition of the amounts in the pipeline. In this example, "prospects" and "needs analysis" are not included in the forecast dollars at all, and all other stages except closed accounts are included as a percentage only. Use the formula that works for your particular environment. Clearly, the detailed information behind the report is critical to fully comprehend the forecast. For instance, what are the deals that are close to being closed? A great reporting environment would allow the manager to click on, say, the committed number for Anna and see the detailed list. This report lends itself well to slicing and dicing by region or industry. Some tool vendors propose inventive, graphical interpretations of this report, for instance in a radar screen configuration. The data, however, is always the same: what is in the pipeline, and where is it? The structure of the report is comparable to that of the knowledge base productivity report, in that it follows items (deals here, documents there) from one state to the next.

Sales Productivity

Goal: Track sales productivity indicators.
Frequency: Probably monthly, depending on sales volumes.
Sample: Table 10.5. Sample Sales Productivity Metrics
Comments: Add targets and accomplishments against targets where they exist. This productivity report is very similar to productivity reports for other functions.

Support Productivity

Goal: Track volumes and how quickly issues are traveling through the system.
Frequency: Typically daily (and weekly, monthly, quarterly). Very busy centers will want to analyze volume in shorter chunks, down to each hour or even half-hour.
Sample: Table 10.6. Sample Support Productivity Metrics
Comments: This is a good example of a "meaty" report. There's a lot of information in this table, but it's easy to read and understand. Note how raw numbers and percentages are presented side by side. After a couple of weeks, managers should be able to tell at a glance whether anything extraordinary happened during the measurement period and whether any corrective action is required. The sample shown here is best for high-complexity support centers (ones where cases take lots of time and effort to resolve). In low-complexity support and service centers, drop the columns for backlog and reopened cases and replace them with handle time and abandoned call rate. This would be the output for the report of a first-level manager. Higher-level managers would see results for subgroups rather than for individuals. This productivity report is very similar to productivity reports for other functions.

Support Issue Distribution

Goal: Track the types of issues that are coming into the support center so you can staff properly and/or invest in preventive measures.
Frequency: Typically weekly or even monthly, except for very busy, dynamic centers, which may want to run it daily.
Sample: Table 10.7. Sample Support Issue Distribution Metrics
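As a sketch of how the counts and percentage shares in a distribution report like this one might be derived from raw case records, here is a minimal example; the category names and case list are made up for illustration.

```python
from collections import Counter

def issue_distribution(case_categories):
    """Count cases per category and express each as a share of the total,
    sorted from most to least common."""
    counts = Counter(case_categories)
    total = len(case_categories)
    return {cat: (n, round(100 * n / total)) for cat, n in counts.most_common()}

# One category tag per case closed during the period (hypothetical data).
cases = ["install", "install", "config", "bug", "install", "config"]
dist = issue_distribution(cases)
# dist["install"] -> (3, 50): 3 cases, 50% of the period's volume
```

Keeping the category list short (as the comments above recommend) is what makes the resulting table readable at a glance.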
Comments: The usefulness of this report is directly related to how meaningful the categories are, so pick them carefully. Typically this is done through hard-coded categories in the case-tracking system. It's best to use a relatively small number of categories (say around 7, and no more than 10) and subdivide them as required rather than allowing a large list of categories. Beware of over-detailed categorizations. Anything over 50 categories is pretty much useless: who can tell what is what with so many choices? The sample is geared towards case topics, but you could run something similar for root causes (say, bug/documentation error/user error/product usability, etc.). A root cause report is key for proactive efforts. This report is similar to the lead generation analysis report, although it's much simpler.

Support Case Aging

Goal: Ensure that no case falls through the cracks.
Frequency: To match your target resolution time (daily if it's a day, weekly if it's a week or more).
Sample: Table 10.8. Sample Support Case Aging Metrics
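Since an aging report is really just a filtered, sorted list of cases, it is easy to sketch. The example below assumes a one-week target and made-up case IDs and dates; a real implementation would pull open cases from the case-tracking system.

```python
from datetime import date, timedelta

def aging_cases(open_cases, today, max_age_days=7):
    """Return open cases older than the target age, oldest first.
    Each case is a (case_id, opened_date) pair."""
    cutoff = today - timedelta(days=max_age_days)
    stale = [c for c in open_cases if c[1] <= cutoff]
    return sorted(stale, key=lambda c: c[1])

# Hypothetical open cases: (case ID, date opened).
open_cases = [("C-101", date(2004, 3, 1)),
              ("C-102", date(2004, 3, 9)),
              ("C-103", date(2004, 2, 20))]
stale = aging_cases(open_cases, today=date(2004, 3, 10))
# C-103 and C-101 are older than a week; C-102 is not listed.
```

Swapping `opened_date` for a last-update timestamp gives the alternative definition of "aging" mentioned in the comments below the sample, and filtering on priority gives the P1-only variant.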
Comments: You need to define what an "aging" case is. Here, it's any case older than a week. Some centers instead focus on the time since the last update to the case; I think it's a lot safer to scrutinize all cases older than a particular target. Another approach is to limit the aging list to "really important" cases, say only P1 cases. Properly speaking, this is not a true "metric," but simply a list of cases. It is, however, very handy! You could run a very similar report on deal aging for sales, showing all deals that have lingered a bit too long at each stage in the process. This is a good template for any kind of aging report.

Knowledge Base Productivity

Goal: Track the progress of new documents through the knowledge base creation system, both by individual and overall.
Frequency: Typically weekly is enough, but busy centers require daily metrics.
Sample: Table 10.9. Sample Knowledge Base Productivity Metrics
Comments: An interesting improvement would be to track the timeliness of the reviews by setting targets for each level and reporting achievement against target in this report. See the case productivity report for an example of reporting achievement against response time targets. This productivity report is very similar to productivity reports for other functions.

Customer Satisfaction

Goal: Track the level of customer satisfaction.
Frequency: Typically weekly is enough, but if you do a lot of surveys each day you may want to go for daily reports. You also need to have an alert mechanism in place to handle particularly good or bad surveys within a business day.
Sample: Table 10.10. Sample Customer Satisfaction Metrics
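The two ways of summarizing survey ratings discussed in the comments below (a plain average versus achievement against a target rating) can be computed side by side. A minimal sketch with made-up ratings and an assumed 8/10 target:

```python
def survey_summary(ratings, target=8):
    """Return (average rating, % of surveys at or above the target)."""
    avg = sum(ratings) / len(ratings)
    at_target = 100 * sum(1 for r in ratings if r >= target) / len(ratings)
    return round(avg, 1), round(at_target)

# Hypothetical 1-10 ratings returned for one engineer during the period.
ratings = [9, 7, 8, 10, 6, 8, 9, 7]
avg, pct = survey_summary(ratings)
```

With only a handful of ratings, both numbers swing wildly on a single outlier, which is exactly the caution about small survey counts made in the comments.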
Comments: Track response rates, because low response rates decrease the reliability of the survey results. Note that, despite earlier rants in favor of target achievement percentages, I'm using averages in this report because I find averages work reasonably well for survey ratings. If you prefer, you can use a target average rating (say 8/10) and report achievement against it. Some support managers hesitate to run individual metrics for customer satisfaction. Don't be shy! Individual measurement and targets are essential for customer satisfaction, as differences between individuals can be literally startling. However, be cautious about using results if individuals receive few surveys (this example would be fine, with 40+ surveys returned per individual). With a handful of surveys, one particularly bad (or good) response could skew the results, whether you compute averages or achievement against target.

Support Financials Summary

Goal: Get a snapshot of the financial data so you can make business decisions.
Frequency: Monthly (unless you get more frequent financial reports).
Sample: Table 10.11. Sample Support Financials Summary Metrics
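The rolling three-month average recommended in the comments below is a simple trailing mean. A sketch, using invented monthly cost figures; at the start of the series the window is necessarily shorter than three months:

```python
def rolling_average(values, window=3):
    """Trailing average over the last `window` values, shorter at the start."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(round(sum(chunk) / len(chunk), 1))
    return out

# Hypothetical monthly support cost (in $K) with meaningless spikes.
monthly_cost = [100, 130, 100, 160, 100]
smoothed = rolling_average(monthly_cost)
# The smoothed series damps the 130 and 160 spikes considerably.
```

The same smoothing applies equally well to revenue or margin columns.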
Comments: This report would be ideal for a fee-based support center. Cost-based centers do not need the revenue-based columns, obviously. I like to see financial figures using a rolling three-month average to smooth out monthly spikes, which are often meaningless. A great improvement to the report would be to add trending information, and graphs would be good too. Finally, the numbers should be broken down by geography or product family for larger centers. A similar report could be created for a sales team to track margins and cost of sales.

Top 10 Customers

Goal: Track the heavy-usage customers so you can identify both customers who are struggling and customers who may be abusing the system.
Frequency: Weekly.
Sample: Table 10.12. Sample Top 10 Customers Metrics
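Mechanically, this report is a top-N ranking over per-customer case counts. A minimal sketch with invented customer names and counts:

```python
def top_customers(new_cases_by_customer, n=10):
    """Rank customers by cases opened during the period, busiest first."""
    return sorted(new_cases_by_customer.items(),
                  key=lambda kv: kv[1], reverse=True)[:n]

# Hypothetical counts of cases opened this week, keyed by customer.
period_cases = {"Acme": 12, "Globex": 30, "Initech": 7, "Umbrella": 19}
top = top_customers(period_cases, n=3)
```

Ranking on backlog size instead of new cases, as the comments below suggest, only changes the dictionary being sorted.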
Comments: Depending on the size of the center, show the top 10, 20, or 40 customers. I chose to concentrate on new cases opened during the period, but it would also make sense to look for the customers with the largest backlogs. This report is particularly useful in centers that serve customers with support contracts. If you have corporate customers, you may want to run this both by contact (individual) and by customer. This would be an interesting report to run on customer sales rather than support usage, to show the most important recent customers.

Knowledge Base Usage

Goal: Track the documents with the heaviest and lowest usage and ratings so you can identify key issues and documents that need to be reworked. You can also use this to reward the authors of particularly successful documents.
Frequency: A weekly run should suffice, although a daily run may make sense in busy environments.
Sample: Table 10.13. Sample Knowledge Base Usage Metrics
Comments: This shows the top documents, but you should also run the report on the lowest-performing documents. This shows ranking by reading scores, but you can also rank by user rating (or do both). In this example, the first document (#125) is read in almost every search in which it appears, so the indexing seems to be working well. However, the very low usefulness rating (9%) points to serious flaws in the document itself. On the other hand, the second document (#111) shows up in a lot of searches in which it is not read, so its indexing should be improved.
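The read-rate and usefulness percentages discussed above can be computed from raw search and rating counts. A minimal sketch; the counts below are invented to reproduce the two situations just described (document #125 read in 95% of searches but rated useful only 9% of the time, document #111 read in only 20% of searches).

```python
def kb_usage(raw_counts):
    """For each document, compute the read rate (reads per search appearance)
    and the usefulness rating (helpful votes per read), as percentages."""
    report = {}
    for doc_id, (found, read, helpful) in raw_counts.items():
        report[doc_id] = (round(100 * read / found), round(100 * helpful / read))
    return report

# doc ID: (times found by a search, times read, times rated useful).
raw_counts = {125: (200, 190, 17), 111: (400, 80, 60)}
usage = kb_usage(raw_counts)
# Doc 125: high read rate, low usefulness -> rework the content.
# Doc 111: low read rate, high usefulness -> rework the indexing.
```

Sorting this dictionary by either percentage, ascending, produces the lowest-performing-documents variant of the report.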