If the customer on your team is comfortable working with executable acceptance tests, life gets a lot easier: you can rely on her to provide realistic test cases or, at the very least, to review the ones you make up. In the best case, you can even teach her to check out and update the tests directly. On the other hand, you may find she isn't disposed to work with the executable format, in which case you need to provide some other format for her to review and update.

Spreadsheets are a good alternative format for presenting acceptance tests to a customer unwilling or unable to work with the executable tests. They're easy for just about anyone to maintain and manipulate, and they provide useful organizational features such as hypertext linking and the ability to store more than one sheet in a workbook. Since you need to keep the spreadsheet and the executable test in sync, the two formats require a more or less one-to-one correspondence. We've found it feasible to use a single spreadsheet workbook to correspond to the class and separate sheets within the workbook to correspond to the methods. In other words, corresponding to our LoginStoryTest.java file would be a spreadsheet file named LoginStoryTest.xls, and corresponding to our testLogin method would be a worksheet with the same name and the contents shown in Table 17.1.
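To make the correspondence concrete, here is a minimal, hypothetical sketch of what LoginStoryTest.java might contain. The LoginService stub and its credentials are invented for illustration; a real test method would drive the actual application with the cases from the testLogin worksheet.

```java
// Hypothetical sketch: the class name matches the workbook
// (LoginStoryTest.xls) and each test method matches a worksheet
// (here, testLogin). LoginService is a made-up stand-in for the
// application under test.
public class LoginStoryTest {

    // Stand-in for the application's login facility.
    static class LoginService {
        boolean login(String user, String password) {
            // Accepts one known user; a real test would exercise the application.
            return "bob".equals(user) && "secret".equals(password);
        }
    }

    // Corresponds to the worksheet named testLogin.
    public void testLogin() {
        LoginService service = new LoginService();
        if (!service.login("bob", "secret")) {
            throw new AssertionError("valid login should succeed");
        }
        if (service.login("bob", "wrong")) {
            throw new AssertionError("invalid login should fail");
        }
    }

    public static void main(String[] args) {
        new LoginStoryTest().testLogin();
        System.out.println("testLogin passed");
    }
}
```

The point is the naming convention, not the assertions: one workbook per test class, one worksheet per test method, so either artifact can be located from the other by name alone.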
Likewise, for our slightly more complicated TaskStoryTest, we would have a spreadsheet file named TaskStoryTest.xls that would contain a worksheet named testCreate, one named testUpdate, and one named testDisplay. Table 17.2 shows what testCreate might contain.
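Because the spreadsheet and the executable test must stay in sync, one option is a small generator that turns rows exported from a worksheet into the source of a test method. The row layout (action, field, value) and the `form` API in the emitted code are assumptions made for this sketch, not part of the story tests above:

```java
import java.util.List;

// Hypothetical sketch of generating an executable test method from rows
// exported out of a spreadsheet worksheet. The (action, field, value)
// row layout is an assumption; a real exporter would match your columns.
public class TestGenerator {

    // Turn one worksheet's rows into the source of a JUnit-style method.
    static String generateMethod(String name, List<String[]> rows) {
        StringBuilder src = new StringBuilder();
        src.append("public void ").append(name).append("() {\n");
        for (String[] row : rows) {
            if ("enter".equals(row[0])) {
                // "enter" rows become calls that fill in a field.
                src.append("    form.enter(\"").append(row[1])
                   .append("\", \"").append(row[2]).append("\");\n");
            } else if ("check".equals(row[0])) {
                // "check" rows become assertions on a field's value.
                src.append("    assertEquals(\"").append(row[2])
                   .append("\", form.get(\"").append(row[1]).append("\"));\n");
            }
        }
        src.append("}\n");
        return src.toString();
    }

    public static void main(String[] args) {
        List<String[]> rows = List.of(
            new String[] {"enter", "username", "bob"},
            new String[] {"enter", "password", "secret"},
            new String[] {"check", "status", "logged in"});
        System.out.print(generateMethod("testLogin", rows));
    }
}
```

Going the other direction, from executable tests back to rows, works the same way in reverse, though as noted below either direction constrains how freely you can refactor the generated side.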
You should be able to store the spreadsheets in the same source-control system as the executable tests. The customer can review and modify the spreadsheet files, and then you (or another team member) can make the corresponding changes in the executable tests. If you don't want to keep the files in sync manually, you can write reasonably simple programs to extract the test cases from the executable tests into a format the spreadsheets can load, or to generate the executable tests from data exported out of the spreadsheets. This can severely limit your ability to refactor the tests, however, and should be undertaken with caution.

Well, that about wraps up the second day of our road trip. We established the stories for the first iteration, planned and estimated all the tasks, and started writing executable acceptance tests, which we organized into files containing Java classes and methods, along with corresponding spreadsheets for the customer where needed. We've come a long way, and tomorrow we'll make the final push: finish writing the executable tests, make as many of them runnable as we can, execute them, report the results, and complete the iteration. Better get a good night's sleep!