15.3 How Well was Scope Implemented?
This question has three components:

  1. Requirements as a reflection of scope. Suppose the team developed a dozen requirements from scope and delivered them. It is possible that in hindsight you recognize you should have produced two or three more, or that one or more requirements were misidentified if not totally blown. Hopefully, you picked this up along the way and managed to recover, but in big projects this is not easy to do well. If the project was complex enough, it is also possible that a few requirements were either dropped or deferred to Day Two. This would require analysis as well, with a discussion of resources, feasibility, the maturity of technology, original assumptions, and funding as possibly relevant areas to explore. In other words, should they have been requirements to begin with? If requirements were dropped or delivered late, was that a good thing or a bad thing?

  2. Requirements delivered. How well were the requirements fulfilled? Earlier in the book, I strongly urged you to develop success metrics beforehand as a means of honestly being able to declare victory after the fact. It was further recommended that these metrics be "hard" or quantifiable to the extent possible, even though it was acknowledged that anecdotal metrics are the norm.

    The delta between intended and actual results can be most instructive. This may be the one area where you must allow detailed technical conversations, because the cause of less than perfect results may have a technology component. The tricky thing here is that most of us are worldly enough to understand and tolerate the shortcomings or drawbacks of even the biggest name products and services we rely upon to do our jobs. Having said that, if part of the project's shortcomings are directly attributable to known deficiencies in Product X or Service Y, do not let your lessons learned exercise turn into a redesign of that product or service.

    In other words, the point to this process is to look for things you can fix and, unless you are a project manager within a big software, hardware, or integration shop, your chance of repairing things of this nature is nil. Do, however, encourage the team to examine the test, piloting, and risk processes as part of this conversation that specifically addresses how well you delivered. In my experience, this is an area where the professionals, as diligent as they may be, can use some "out of the box" assessment and mentoring. This last statement, by the way, is an observation, not a criticism.

  3. Exposure to perception. From the moment a project is announced to perhaps years beyond its completion, there can be a wide range of interpretations regarding how well it did what it was supposed to do. The debate can be as much about what it was supposed to do, but did not, as about how well it delivered whatever it did. Obviously, if some observers believe you should have done X, Y, and Z, even though your interpretation of scope did not lead to the same conclusion, you may be criticized as having under-delivered or failed.

Although negative press may be deserved, it could also be the result of politically motivated criticism. It may also come from people unfamiliar with your team's valiant struggle to make sure that key deliverables were implemented. Knowing this should help you temper whatever you choose to publish in your lessons learned document.

The scope of a campus project I was associated with was to develop and implement state-of-the-art technology to pilot the next generation buildings this client would occupy. As a result, the original design included some deliverables that it turned out, unfortunately, were too:

  • Immature from a technical perspective to meet production standards

  • Difficult to successfully assemble and integrate within calendar constraints

As project manager, would you dive into the latter scenario from a lessons learned perspective, knowing full well that the significant dollars invested in this piece of the project did not lead to useful products or services? With this one, I was tempted to invoke the "ask me no questions, and I'll tell you no lies" rule. In other words, because harm could come to many from resurrecting this dyspeptic chapter in an otherwise successful project, it seemed expedient to go into denial on this piece of it.

The argument can be made that whoever coined the phrase "Honesty is the best policy" was either incredibly wealthy or terribly reckless. On the other hand, the problem with being disingenuous is that this sort of unpleasant memory is likely to be unearthed when the "recollector" finds it politically expedient. Therefore, whatever you put on paper can be used as a club against you later. Thus, it should be addressed. In a few pages, this particular issue will resurface in an example. You can judge for yourself whether this is the way you might wish to handle your own missed requirements at future postmortems.

When you formulate lessons learned, be sure to capture any rationale that serves as "mitigating circumstances" and that might moderate second-guessing. The more lethal attackers will probably dismiss your rationale as excuses, so do not overdo it. If you had a good reason for shaving scope, clearly and honestly document the rationale and impact, and move on. By the way, there can be times when you can say, with pride, "At least we did not throw good money after bad."

Complex IT Project Management: 16 Steps to Success
ISBN: 0849319323
Year: 2004
Pages: 231
Authors: Peter Schulte