Chapter 1: The State of Software Testing Today

Overview

The director asked the tester, "So you tested it? It's ready to go to production?"

The tester responded, "Yes, I tested it. It's ready to go."

The director asked, "Well, what did you test?"

The tester responded, "I tested it."

In this conversation, I was the tester. It was 1987 and I had just completed my first test assignment on a commercial software system.[1] I had spent six months working with and learning from some very capable testers. They were good at finding bugs, nailing them down, and getting development to fix them. But once you got beyond the bug statistics, the testers didn't seem to have much to go on except "it." Happily, the director never asked what exactly "it" was.

The experience made me resolve never again to be caught with such a poor answer. I could not always be so lucky as to have management that would accept "it" as an answer. All my training as a structural engineer had prepared me to give my management a much better answer than "it."

Suppose the supervisor on a building project asked if I tested the steel superstructure on Floor 34 and needed to know if it was safe to build Floor 35. If I said "yes" and if the supervisor then asked, "What did you test?" I would have a whole checklist of answers on the clipboard in my hand. I would have a list with every bolt connection, the patterns of those connections, the specified torque wrench loading used to test the bolts, and the results from every bolt I had touched. I would know exactly which bolts I had touched because each would be marked with fluorescent paint, both on my chart and on the steel.

Why should software testing be any different? I could certainly give my management a better answer than "it." Many of these better answers were around when the pyramids were being built. When I am asked those questions today, my answer sounds something like this:

As per our agreement, we have tested 67 percent of the test inventory. The tests we ran represent the most important tests in the inventory as determined by our joint risk analysis. The bug find rates and the severity composition of the bugs we found were within the expected range. Our bug fix rate is 85 percent.

It has been three weeks since we found a Severity 1 issue. There are currently no known Severity 1 issues open. Fixes for the last Severity 2 issues were regression-tested and approved a week ago. The testers have conducted some additional testing in a couple of the newer modules. Overall, the system seems to be stable.

The load testing has been concluded. The system failed at 90 percent of the design load. The system engineers believe they understand the problem, but they say they will need three months to implement the fix. Projections say the peak load should only be at 75 percent by then. If the actual loading goes above 90 percent, the system will fail.

Our recommendation is to ship on schedule, with the understanding that we have an exposure if the system utilization exceeds the projections before we have a chance to install the previously noted fix.
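The status report above rests on two simple ratios: inventory coverage (tests run divided by tests in the inventory) and the bug fix rate (bugs fixed divided by bugs found). The sketch below shows that arithmetic; the specific counts are hypothetical, chosen only so the results match the 67 percent and 85 percent figures in the report.

```python
# A minimal sketch of the arithmetic behind the status report above.
# The raw counts (tests_run, inventory_size, bugs_fixed, bugs_found)
# are hypothetical; only the two ratios come from the narrative.

def coverage_pct(tests_run: int, inventory_size: int) -> float:
    """Percentage of the test inventory that has been executed."""
    return 100.0 * tests_run / inventory_size

def fix_rate_pct(bugs_fixed: int, bugs_found: int) -> float:
    """Percentage of reported bugs that have been fixed."""
    return 100.0 * bugs_fixed / bugs_found

if __name__ == "__main__":
    # Hypothetical numbers that reproduce the report's figures.
    print(f"Inventory coverage: {coverage_pct(335, 500):.0f}%")  # 67%
    print(f"Bug fix rate: {fix_rate_pct(170, 200):.0f}%")        # 85%
```

Trivial as they are, these ratios are exactly the kind of measurement the survey results below show most test organizations never collect.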

The thing I find most amazing is that answers like these are not widely used in the industry today. I regularly hear testers and developers using "it" metrics. Throughout the 1990s I gave out a survey every time I taught a testing course. Probably 60 percent of the students taking these courses were new to testing, with less than one year as a tester. About 20 percent had from one to five years' experience, and the remainder were expert testers. The survey asked each student to define common testing terms, such as test, and to identify the methods and metrics that they regularly used as testers. The complete results of these surveys are presented in the appendix of this book. I will mention some of the highlights here.

  • The only metrics used regularly have to do with counting bugs and ranking them by severity. Only a small percentage of respondents measure the bug find rate or the bug fix rate. No other metrics are widely used in development or testing, even among the best-educated and seemingly most competent testers. It can also be inferred from these results that the companies for which these testers work do not have a tradition of measuring their software development or test processes.

  • Few respondents reported using formal methods such as inspection or structured analysis, meaning some documented structured or systematic method of analyzing the test needs of a system. The most commonly cited reason for attending the seminar was to learn some software testing methods.

  • The majority of testers taking the survey (76 percent) had had some experience with automated test tools. Today an even greater percentage of testers report that they have used automated test tools, but test automation is also rated as the most difficult test technique to implement and maintain in the test effort.

  • The respondents who are not actively testing provided the most accurate definitions of the testing terms. The people performing the testing supplied the poorest definitions of the testing tasks that they are performing most frequently.

Overall, I believe that testers' level of awareness is improving, and, certainly, software testing practices have improved in the commercial software development sector. Today there are more publications, tools, and discussion groups available than there were 10 years ago. There are certainly more shops attempting to use formal methods and testing tools. But the survey results haven't changed much over the years.

How did we get to this mind-set? How did these limitations in perception, and, unfortunately, all too often in practice, come about? To understand that, we need to examine the evolution of software development and testing during the last two decades.

[1]Commercial software is software that is commercially available and can be purchased by the public. This distinguishes it from safety-critical, proprietary, or military software.



Software Testing Fundamentals: Methods and Metrics
ISBN: 047143020X
Year: 2005
Pages: 132
