Assessing Quality in Iterative Development


When we abandoned the waterfall development model earlier in the text, we also abandoned an apparent convenience: the fixed mileposts in development where specific measures could be applied to "complete" artifacts. For example, at the end of the requirements "phase" we could hold a review to inspect the requirements and assure that they were complete and unambiguous. At the end of the design phase, we could inspect the design, and so on. It looked so good on paper that we even believed it for a while, but we eventually came to recognize that it just doesn't work that way in real life. Oh well, methods, like time, march constantly forward, so let's look at how we apply quality measures in the context of iterative development. In Chapter 3 we described a process model that was iterative in nature, and we used the diagram shown in Figure 29-1 to convey the key concepts.

Figure 29-1. Iterative development model


As we described in Chapter 3, development progress is measured not by the completion of artifacts but by incremental iterations, a series of "builds" that more objectively demonstrate to ourselves and others the progress we've made in defining and building the system of interest. The delivery of an iteration, even a very early iteration that may not result in any executable or demonstrable code, is by itself a primary quality measure. For we can now ask and answer the following questions, basing our answers on the objective evidence exhibited by the iteration itself.

"Does it do what we said it would do?"

"Does it appear to meet the requirements as we know them at this time?", and

"Did we do it about when we said we would?"

Perhaps even more important, even if it does what we said it would do and performs as stated in the requirements, we can ask:

"Now that you can see a bit of this thing, is this what you really wanted? Is this what you really meant ? "

Thanks to iterative development, we can get this early feedback and alter the course of action before additional investment is made. When the team has the ability to ask and answer these questions early and often, it can be assured that quality measures are inherent in the process itself.

However, since quality also requires adherence to agreed-upon processes, we can inspect the artifacts of the process for quality as well. Indeed, in some early iterations, preliminary artifacts may be all that are available for assessment. These artifacts both demonstrate that the process is being followed and provide more tangible, system-oriented work products we can analyze and measure. This provides assurance that the demonstrated iteration contains inherent quality, that there are no unexpected deviations from plan, and that follow-on activities are building on a solid foundation.



Managing Software Requirements: A Use Case Approach
ISBN: 032112247X
Year: 2003
Pages: 257
