9.11 IDENTIFY AVAILABLE DATA SOURCES

We can prioritize the various requirements we have collected essentially by how easy they will be to satisfy. Remember that business requirements, and advice from the Metrics Coordination Group, may take precedence over what the organization, or you, have considered important in the past; but available data sources should be identified in any case, as they are likely to help you satisfy current requirements.

The first thing to do is to look at what data is currently available and where it resides. To do this you will need to talk to people yet again, and this is another case where the Metrics Coordination Group can help. You can also draw on your own experience and that of other contacts within the organization. Beware! It is very easy to assume that the way you have done things in the past is also the way everyone else does them, but this is not always the case. In one instance I built a model for capturing defect data using what I believed to be a standard tracking system, only to find that the process was used by just one third of the organization: a classic case of not checking the facts thoroughly enough. Fortunately I had wasted only a couple of days, but the situation could have been very costly and embarrassing. Imagine presenting a new system to a group of senior managers only to find that they had never heard of the process at the core of your proposal!

You will find certain data collection mechanisms already in place within an organization, although it must be said that the validity of this data may well be suspect. Typically an organization will collect data relating to field defects, that is, defects reported by the customer or the end user. This can be a useful source of information. The other most commonly collected form of data relates to time spent by development staff, yet many organizations have little faith in their time recording systems!

There are various reasons for this. Often time data is collected weekly or monthly. Personally, I find it difficult enough to remember what I was doing during the morning if I do not record that information until the afternoon. Completing a timesheet at the end of the week or, worse, at the end of the month usually means that the information is very inaccurate. Having said that, things do tend to even out. One organization I know of set up an experiment that compared recorded time to actual time spent on a number of specific projects. The findings were interesting: the recorded time was indeed terribly inaccurate, yet it was consistently inaccurate across the sample, coming out approximately 30% less than the actual time for any one project.
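Consistent inaccuracy is usable inaccuracy. A minimal sketch, with hypothetical figures, of how a single calibration factor could compensate for recorded time that consistently runs about 30% below actual time:

```python
# Hypothetical calibration of under-recorded timesheet data.
# Assumption (from the experiment described above): recorded time is
# consistently about 30% less than actual time on any one project.

UNDER_REPORTING = 0.30  # assumed consistent shortfall in recorded hours

def estimated_actual_hours(recorded_hours: float) -> float:
    """Scale recorded hours up to compensate for consistent under-recording."""
    return recorded_hours / (1.0 - UNDER_REPORTING)

# A project whose timesheets total 700 hours likely consumed about 1000.
print(round(estimated_actual_hours(700.0)))  # 1000
```

The point is not the specific factor, which would need to be established by your own comparison of recorded against actual time, but that a systematic bias can be corrected once it is measured.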

Defect data tends to be less inaccurate in terms of absolute numbers. The problem with defect data tends to lie in inaccurate detail, or in information that you would like to have simply not being collected. For instance, on one project I observed that the number of defects attributed to coding faults was very high. This is a common occurrence, but equally common, and the case on this project, was that after investigating these defects more carefully I found that coding faults accounted for less than half of those originally reported as such; the rest were design or requirements faults. This can have fairly serious consequences: the data suggested that the coding phase of development should be investigated, when in reality the design phase was by far the more costly in terms of injected faults. This experience was duplicated by other individuals in the same and different organizations. In fact, it has happened so frequently that I would always be suspicious of such data.
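The reclassification exercise above can be sketched as a simple tally. The records and phase names here are illustrative, not from the project described:

```python
from collections import Counter

# Hypothetical defect records: each was originally logged as a coding
# fault, but closer investigation attributes many to earlier phases.
defects = [
    {"id": 1, "logged_as": "coding", "actual_phase": "design"},
    {"id": 2, "logged_as": "coding", "actual_phase": "coding"},
    {"id": 3, "logged_as": "coding", "actual_phase": "requirements"},
    {"id": 4, "logged_as": "coding", "actual_phase": "design"},
    {"id": 5, "logged_as": "coding", "actual_phase": "coding"},
]

logged = Counter(d["logged_as"] for d in defects)
actual = Counter(d["actual_phase"] for d in defects)

print(logged["coding"])  # 5 defects logged against coding
print(actual["coding"])  # only 2 are genuine coding faults
```

Comparing the two tallies shows how the raw data would point an improvement effort at the wrong phase.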

This lack of care in recording information seems to be a direct result of the lack of feedback from these established systems. People record the information because they have to, because it is part of the organizational bureaucracy. Having recorded the information they never see or hear of it again, so why should they bother to take care? This observation led me to coin the phrase "write-only database": a ludicrous situation in anyone's view, but one that unfortunately occurs with distressing frequency.

The problem with missing data seems to arise from design by committee and a distinct lack of effective requirements analysis when these systems are set up. Encourage people to ask "Why are we collecting defect or time data?" rather than "What data should we collect?"

You may be tempted to put completely new systems in place, but take care. Developing new systems from the ground up for a large organization can be very costly, and it is often easier to change something that already exists than to start from scratch.

At this point you should be noting the type of information that is available and relating it to the various requirements you have identified. This will give you an indication of what is involved in satisfying those requirements and will help you to prioritize them. For example, it is easier to satisfy a requirement for information about productivity if you already have a reasonable, and please note the word reasonable, effort-recording system in place.
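One way to make this mapping concrete is to score each requirement by how much of the data it needs is already being collected, and tackle the best-covered requirements first. The source names and requirements below are purely illustrative assumptions:

```python
# Illustrative sketch: relating requirements to available data sources.
# All names here are hypothetical examples, not a prescribed taxonomy.

available_sources = {"field_defects", "effort_recording"}

requirements = {
    "productivity trend": {"effort_recording", "size_measure"},
    "field reliability": {"field_defects"},
    "phase containment": {"inspection_data", "field_defects"},
}

def coverage(needed: set) -> float:
    """Fraction of a requirement's needed data already being collected."""
    return len(needed & available_sources) / len(needed)

# Requirements with higher coverage are cheaper to satisfy first.
ranked = sorted(requirements, key=lambda r: coverage(requirements[r]),
                reverse=True)
print(ranked)  # 'field reliability' comes first: all its data exists
```

A spreadsheet would serve just as well; the value lies in making the requirement-to-data relationship explicit before committing to priorities.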






Software Metrics: Best Practices for Successful IT Management
ISBN: 1931332266
Year: 2003
Pages: 151
Author: Paul Goodman
