Future Experiment


This coming semester (February 2001) will see the start of the next Software Hut exercise. As usual, three clients will each deal with four to six teams. We plan to divide the teams into two groups, one of which will be given some reinforcement in traditional software design techniques, and the other will get a crash course in XP. We will then monitor the progress of the two cohorts of students, some using XP and others not, as they attempt to build their solutions.

We will monitor progress by studying the way the students manage their projects. Each team must produce and maintain realistic plans, keep minutes of all their project meetings, and be interviewed weekly. We will also collect all their working documents: requirements documents, analyses, test cases, designs, code, and test reports. These will provide many further opportunities for measuring the attributes of their output, ranging from function and object point analysis to bug densities. The XP experiments suggested by Ron Jeffries will be helpful in this respect.[3]

[3] See the XProgramming Web site at http://www.xprogramming.com.
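At the simpler end of that range, a measure such as defect density is straightforward to compute once the test reports and delivered code are in hand. The sketch below (in Python) shows the calculation; the team names and figures are invented for illustration and do not come from any real project.

    # Defect density: defects found per thousand delivered lines of code (KLOC).
    # Team names and figures below are invented for illustration only.

    def defect_density(defects_found, lines_of_code):
        return defects_found / (lines_of_code / 1000.0)

    teams = {
        # team: (defects recorded in test reports, delivered lines of code)
        "XP-1":   (14, 4200),
        "Trad-1": (23, 5100),
    }

    for name, (defects, loc) in teams.items():
        print(f"{name}: {defect_density(defects, loc):.1f} defects/KLOC")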

At the end of the semester, the clients will evaluate and mark all the delivered solutions. They will use a structured marking scheme that we construct for them, which provides a final level of measurement of how well the solutions perform in terms of usability, installability, functionality, robustness, and so on. These are the key attributes because they apply to all the solutions, no matter how they were built. We will use this information in a statistical analysis to see whether there are any significant differences in the quality of the final products between XP and traditional "heavyweight" methods.
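As a concrete illustration of what that comparison might look like, the following is a minimal sketch only: the marks are invented, and the choice of a nonparametric Mann-Whitney rank test is an assumption we make here because the cohorts are small and the marks cannot be assumed to be normally distributed. The same pattern would apply to each quality attribute the clients mark.

    # Sketch of comparing client marks between the two cohorts.
    # Scores are hypothetical; a Mann-Whitney U test is used because the
    # samples are small and no normality assumption can be made.
    from scipy.stats import mannwhitneyu

    xp_marks   = [72, 65, 81, 70, 68, 77]   # hypothetical client marks, XP teams
    trad_marks = [61, 74, 58, 66, 63, 69]   # hypothetical client marks, Trad teams

    stat, p_value = mannwhitneyu(xp_marks, trad_marks, alternative="two-sided")
    print(f"U = {stat}, p = {p_value:.3f}")
    if p_value < 0.05:
        print("Difference between cohorts is significant at the 5% level.")
    else:
        print("No significant difference detected at the 5% level.")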

Finally, we will require each student to give both a team evaluation and a personal commentary on how the project went, describing the strengths and weaknesses of what they did and how they did it. In the past, this has been useful in identifying issues and problems with approaches to software development.

After delivery, we will be able to track the performance of the delivered systems to gain further information about their quality in their working environment.

The three clients are as follows.

  • Client A is an organization that brokers waste. A waste exchange provides a facility for industrial companies to offer their waste products to other companies that might be able to reclaim something of value from them. The waste exchange maintains a database of current waste products and arranges for the exchange and payment of deals in waste. The project is to build a Web-based system that interfaces with the existing database and gives the exchange's customers the opportunity to browse it.

  • Client B is a small start-up company in the bioinformatics industry requiring software for data analysis. The company has developed various new algorithms for processing and analyzing genomic and proteomic data, and it now requires a set of programs that can automatically apply these algorithms to data that is continually being placed on Web sites directly from the scientific experiments (see the sketch after this list).

  • Client C is a legal practice center that provides specialist training for the legal profession. One aspect of this training is the postacademic qualification stage, which deals with the experiential learning related to legal practice in solicitors' offices. The client needs a computerized assessment system that provides a mechanism for tracking and evaluating each student's performance in the course.
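To make the shape of Client B's requirement more concrete, the sketch below shows one way such a program could work, assuming the experimental data appears as files listed at a known URL. The URL, the one-file-per-line listing format, and the analyse() placeholder are hypothetical and stand in for the company's own data sources and algorithms.

    # Sketch of a polling program for Client B. All names and URLs are
    # hypothetical placeholders, not part of the client's actual system.
    import time
    import urllib.request

    DATA_INDEX = "http://example.org/experiments/index.txt"  # hypothetical
    POLL_SECONDS = 600

    def analyse(raw_data: bytes) -> None:
        """Placeholder for one of the company's genomic/proteomic algorithms."""
        print(f"analysed {len(raw_data)} bytes")

    seen = set()
    while True:
        # The index is assumed to list one data-file URL per line.
        with urllib.request.urlopen(DATA_INDEX) as index:
            urls = index.read().decode().splitlines()
        for url in urls:
            if url and url not in seen:
                seen.add(url)
                with urllib.request.urlopen(url) as data:
                    analyse(data.read())
        time.sleep(POLL_SECONDS)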

The overall arrangements are described in Figure 22.1.

Figure 22.1. The organization of the teams and clients


In all of this, the students will be basing their approach on what they have learned in the course so far. In the first year, they will have taken part in a group project that involves building a small software system specified by the course lecturers. The students do this as one-sixth of their work over the year, and it integrates what they have been taught in formal lectures on requirements and specification, Java programming, and systems analysis and design (essentially UML). This exercise helps them begin to understand some of the issues involved in working in teams, keeping accurate records, and producing high-quality documents; some of the problems of dealing with clients (a role played by their academic tutors) and of delivering quality; and the need for thorough review and testing activities.

Before they start on the Software Hut projects, they attend a practical course on teamwork, organized by the university's Careers Services Department.

They will then be split into two cohorts, the XP teams and the Trad (traditional) teams, for further specific training in a methodology and an approach to software construction.

One area that we must address concerns the advice we give about the form of the project plan. The XP-based plans will be very different from the traditional ones, and it will be a new experience for the tutors to manage a set of projects that are at very different stages at any one time. The students will also compare notes to some extent, and we hope that the teams using XP will be discreet about what they are doing so that they do not influence the other teams too much. We have found in the past that the competitive element has minimized this.

Part of this trial run will be learning about the sorts of metrics and data we need to enable us to carry out proper comparisons. We will then be able to run better experiments.


