Given all of the work that a typical IT organization is expected to deliver each year, it is easy to see how even major commitments may be overlooked or misinterpreted in the furious effort to get things done. To avoid this pitfall, IT must clarify its commitments to all concerned. Prior sections of this chapter have shown how this may be done for both ongoing service delivery and project implementations. Next, IT management must ensure compliance with its commitments once made. Here again, the aforementioned processes keep the team appropriately focused. Service level management requires the integration of performance metrics into each SLA, while the ongoing project management process forces the team to relate actual accomplishments to its project plan. During the regular visits of the CRE with his or her customers, service level and project delivery may be raised with the customer to assess his or her current satisfaction with IT performance.
Although each of these activities is important in cementing and maintaining a good working relationship with individual customers, a more comprehensive view is required of how IT services and projects relate to one another. For its part, IT executive management needs a more aggregate view of the unit's performance, especially in larger organizations. To this end, the author has relied on a single, integrated reporting process, called the monthly operations report, to capture key IT accomplishments and performance metrics. Chapter 4 and Chapter 5 will comment on the operations report format in detail. What follows is an introductory overview.
As its name implies, the monthly operations report is a regularly scheduled activity. The document is entirely customer-focused and therefore aligns with both the service level management and project commitment processes. However, the monthly operations report is designed primarily to serve the needs of IT management, keeping customer delivery at the forefront of the team's attention and holding IT personnel accountable for their commitments. The report reflects qualitative information from each IT service delivery unit (e.g., help desk, training center, network operations, production services, and so forth) concerning accomplishments and issues during the course of the month. Each accomplishment must be aligned with a customer value, as articulated in SLAs and project commitment documents, if it is to be listed as a deliverable. Each issue must address who was impacted by the product or service failure, as well as how it was resolved. Next, the report lists quantitative performance data, such as system availability, system response time, problem tickets closed, training classes offered, and the like. Note that some of these data points measure activity rather than results and must be balanced with customer satisfaction metrics to be truly meaningful.
To that end, the author has developed and deployed a low-cost system for collecting customer feedback. This simple surveying process is guided by the following operating principles:
First, the process must require no more than two minutes of an individual customer's time
Second, it must be conducted by phone or face-to-face, never via paper forms or e-mail, although recently available Web-based surveying tools are much less obtrusive than paper or e-mail and could serve as a surrogate for actual human interaction
Third, it must employ measures of customer satisfaction rather than IT activity
Fourth, at the very least, it must scientifically sample IT customer populations
Fifth, it must be carried out in a consistent manner on a regular basis
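The fourth principle, scientific sampling, can be made concrete with the standard sample-size formula for estimating a proportion. The sketch below is illustrative only; the confidence level, margin of error, and population figure are assumptions, not prescriptions from the text.

```python
import math

def sample_size(population: int, confidence_z: float = 1.96,
                margin_of_error: float = 0.05, proportion: float = 0.5) -> int:
    """Minimum sample size for estimating a proportion (e.g., the share of
    satisfied customers) within a given margin of error.

    Uses n0 = z^2 * p * (1 - p) / e^2, then applies the finite-population
    correction, which matters for a help desk's bounded customer base.
    """
    n0 = (confidence_z ** 2) * proportion * (1 - proportion) / (margin_of_error ** 2)
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# A help desk serving 2,000 customers needs roughly 323 responses for
# +/-5% at 95% confidence; spread across a month, that is a manageable
# number of two-minute calls for a single staff member.
print(sample_size(2000))
```

The finite-population correction is worth keeping: without it, the formula overstates the calls required for smaller customer populations.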
Guided by these principles, the author's own project management office team has implemented effective survey tools for many types of IT service.
To initiate these processes, the team has drawn randomly from the help desk customer database, where requests for service and problem tickets are recorded. A single staff member spends the first few days of each month calling customers, employing the appropriate survey scripts. Results are captured in a simple database and then consolidated for the report. These customer satisfaction scores are also tracked longitudinally. Initially, my IT colleagues were skeptical, but now that they appreciate the objectivity of the process, they value the useful information that it generates. For their part, our customers have given us high marks for launching a process that tracks and reports publicly on their satisfaction (or not!) with our services. All the summary data appears in the monthly operations report.
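The draw-sample, survey, and consolidate workflow just described might be sketched as follows. The field names, the 1-to-5 satisfaction scale, and the helper functions are illustrative assumptions, not the actual system the author's team built.

```python
import random
import statistics
from collections import defaultdict

# Hypothetical closed-ticket records, as a help desk database might return them.
closed_tickets = [
    {"ticket": 1000 + i, "customer": f"user{i}", "month": "2004-01"}
    for i in range(40)
]

def draw_sample(tickets, size, seed=None):
    """Randomly select tickets whose callers will be surveyed (principle four)."""
    rng = random.Random(seed)
    return rng.sample(tickets, min(size, len(tickets)))

def consolidate(responses):
    """Roll phone-survey scores up into per-month averages so satisfaction
    can be tracked longitudinally in the operations report."""
    by_month = defaultdict(list)
    for month, score in responses:
        by_month[month].append(score)
    return {m: round(statistics.mean(scores), 2)
            for m, scores in sorted(by_month.items())}

calls = draw_sample(closed_tickets, 10, seed=1)
# Scores as the surveyor might record them after each two-minute call.
scores = [("2004-01", 4), ("2004-01", 5), ("2004-02", 3), ("2004-02", 4)]
print(consolidate(scores))  # {'2004-01': 4.5, '2004-02': 3.5}
```

Seeding the random draw is optional; it simply makes a month's sample reproducible if the selection is ever questioned, which supports the objectivity the author emphasizes.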
Project delivery is a little more complicated to capture on a monthly basis because projects do not necessarily lend themselves to quantitative measures or to regular surveying of customer satisfaction. Nevertheless, the operations report contains two sets of documents that IT management finds useful. The first is a project master schedule that lists all current and pending IT projects alphabetically by title, indicating IT ownership (i.e., the project director) and interproject dependencies. The schedule also shows the duration of each project and its status: white for completed, green for on schedule, yellow for in trouble but under control, red for in trouble, and purple for pending. Thus, within a few pages, the IT leadership can see all of the discretionary work under way at any given time, what is in trouble, where the bottlenecks are, and who is overcommitted.
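The master schedule's color coding lends itself to a simple data model. The sketch below is one way to represent it; the project names, field names, and helper functions are hypothetical, since the book describes the schedule as a document rather than prescribing an implementation.

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    """The master schedule's five-color status code."""
    WHITE = "completed"
    GREEN = "on schedule"
    YELLOW = "in trouble but under control"
    RED = "in trouble"
    PURPLE = "pending"

@dataclass
class Project:
    title: str
    director: str                                    # IT ownership
    status: Status
    depends_on: list = field(default_factory=list)   # interproject dependencies

def master_schedule(projects):
    """List all current and pending projects alphabetically by title."""
    return sorted(projects, key=lambda p: p.title.lower())

def needs_attention(projects):
    """Surface yellow and red projects for the leadership review."""
    return [p for p in projects if p.status in (Status.YELLOW, Status.RED)]

demo = [
    Project("Storage upgrade", "Jones", Status.RED),
    Project("Help desk rollout", "Smith", Status.GREEN),
    Project("Portal redesign", "Lee", Status.YELLOW, depends_on=["Storage upgrade"]),
]
print([p.title for p in master_schedule(demo)])
print([p.title for p in needs_attention(demo)])
```

Even this minimal structure supports the leadership questions the author lists: the dependency fields expose bottlenecks, and grouping by director would show who is overcommitted.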
The second document set is equally simple and visual. Within the operations report, each project has its own scorecard, a single-page representation of that project's status. Like everything else in the report, the scorecard is a monthly snapshot that includes a brief description of the project and its value to the customer, a list of customer and project team participants, this month's accomplishments and issues, a schematic project plan, and a Gantt chart of the current project phase's activities. Like the master schedule, scorecards are coded white, green, yellow, red, or purple as appropriate. (See Chapter 5 for more details and The Hands-On Project Office at http://www.crcpress.com/e_products/downloads/download.asp?cat_no=AU1991 for template examples.)
The monthly operations report is reviewed in an open forum by the IT executive team. Other IT personnel are welcome to attend. Within a two- to three-hour block, the entire IT leadership team gains a clear picture of the status and health of all existing IT commitments. Follow-up items raised in the review meeting are recorded and appear in the next version of the report. The document itself is distributed to the entire IT organization via the unit's intranet. As appropriate, sections from the report, as well as individual project scorecards, are shared by the unit's CREs with their respective customers. In brief, the process keeps accomplishments and problems visible and everyone on their toes. Bear in mind that the focus of this process is continuous improvement and the pursuit of excellence in customer delivery. Blame is never individually assessed, because the entire IT team is held accountable for the success of the whole. On the other hand, it is marvelous to observe how much more conscientious IT service and project delivery managers are when they know that they must face their peers each month to report on their progress, or the lack thereof.
In a world of growing complexity, where time is of the essence and the resources required to deploy IT effectively remain constrained, the frameworks and simple tools outlined in this chapter have proved useful to me, and, if adapted to the reader's own work environment, should serve him or her well. The underlying principles for these efforts are simple, and the practices themselves are commonsensical. These remain the keys to success:
A solid focus on customer value
Persistence in the use of standard, rigorously defined but creatively adapted and flexibly executed processes
Quality and continuous communication
A true commitment to collaborative work
The reader will do well to bear this in mind as we dig deeper into the processes mentioned in this chapter.
On the other hand, the picture painted thus far should raise concerns among some readers. The author advocates a series of internal IT management practices that perhaps enjoy no precedent within the reader's IT organization. In short, I am asking you to add still more to your overhead costs. This is true. The recommendations in this book require an investment of time and effort on the part of the IT team. You may not have the right skills on hand to get this particular set of activities accomplished. Furthermore, you may not, in your view, have the bandwidth to add these responsibilities to existing organizational roles. Yet ask yourself the following questions:
What is the quality of your relations with your customers today, and what is the quality of communication between your line-of-business sponsors and IT?
How much scrap and rework do you incur annually?
How many projects fail?
What is the enterprise's view of IT service delivery?
Is the information technology organization getting the resources it needs to satisfy customer requirements?
It should be clear where I am headed. The cost of operating as I suggest will have an impact on IT resources, but these demands pale in comparison to the risks and costs that you and your team face through ineffective communication and tarnished service and project delivery. The next chapter addresses how IT should configure for successful delivery management, as well as the tools to justify such an investment in people and process change.
This process for collecting customer satisfaction data operated between 2000 and 2003 at Northeastern University, where it was subsequently replaced by a Web-enabled survey process built with the eSurveyor tool. With the introduction of eSurveyor, integrated with the Information Services Division's Remedy problem-tracking system, the PMO now has an automated way to reach all customers, obviating the need to sample trouble ticket callers.
Also see The Hands-On Project Office, http://www.crcpress.com/e_products/downloads/download.asp?cat_no=AU1991, chpt4~6~monthly service delivery report~template, chpt4~7~monthly service delivery report~example, chpt5~15~project scorecard~template, chpt5~17~monthly project status report~template, and chpt5~18~monthly project status report~example.