Frequent Builds and Smoke Tests Are Mandatory


Two of the most important pieces of your infrastructure are your build system and your smoke test suite. The build system is what compiles and links your product, and the smoke test suite comprises tests that run your program and verify that it works. Jim McCarthy, in his book, Dynamics of Software Development (Microsoft Press, 1995), called the daily build and smoke test the heartbeat of the product.

Frequent Builds

Your project has to be built every day. That process is the heartbeat of the team, and if you're not building, your project is dead. Many people tell me that they have absolutely huge projects that can't be built every day. Does that mean that those people have projects that are even larger than the 50 million lines of code in the Windows Vista source code tree? Windows Vista is the largest commercial software project in existence, and it builds every day. So there's no excuse for not building every day. Not only must you build every day, but you must have a build that is completely automated.

When building your product, you must build both release and debug versions at the same time. As you'll see later in the chapter, the debug builds are critical. Breaking the build must be treated as a sin. If developers check in code that doesn't compile, they need to pay some sort of penalty to right the wrong. A public flogging might be a little harsh (though not by much), but what has always worked on the teams I've been on is penance in the form of supplying doughnuts to the team and publicly acknowledging the crime. If you're on a team that doesn't have a full-time release engineer, you can punish the build breaker by making him or her responsible for taking care of the build until the next build breaker comes along.

One of the best daily-build practices I've used is to notify the team via e-mail when the build is finished. With an automated nightly build, the first message everyone can look for in the morning is the indication of whether the build failed; if it did, the team can take immediate action to correct it.

To avoid problems with the build, everyone must have the same versions of all build tools and parts. As I mentioned earlier, some teams like to keep the build system in version control to enforce this practice. If you have team members on different versions of the tools, including the service pack levels, you have room for error in the build. Unless there is a compelling reason to have someone using a different version of the compiler, no developer should be upgrading on his or her own. Additionally, everybody must be using the same build script as the build machine to do their builds. That way there's a valid relationship between what developers are developing and what the testers are testing.

Your build system will be pulling the latest master sources from your version control system each time you do a build. Ideally, the developers should also be pulling from version control every day. If it's a large project, developers should be able to get the daily compiled binaries easily to avoid big compilation times on their machines. Nothing is worse than spending time trying to fix a nasty problem only to find out that the problem is related to an older version of a file on a developer's machine. Another advantage of developers pulling frequently is that it helps enforce the mantra of "no build breaks." By pulling frequently, any problem with the master build automatically becomes a problem with every developer's local build. Whereas managers get annoyed when the daily build breaks, developers go ballistic when you break their local build. With the knowledge that breaking the master build means breaking the build for every individual developer, the pressure is on everyone to check only clean code into the master sources.

Wonderful MSBuild

One of the most exciting parts of .NET 2.0 is the inclusion of the fantastic MSBuild system. It's in every install of .NET itself, and with all Visual Studio projects being MSBuild files, we finally have a situation in which you can use exactly the same build script in the IDE as on the build server. Having built several industrial-strength build systems in the past, I can only applaud Microsoft for getting it mostly right in the first version. Once we have direct native C++ building and per-CPU builds in MSBuild, we'll have the ultimate in build technology.

With MSBuild a vital tool in your life, you cannot afford to be ignorant of the technology. Some good resources include the MSBuild team blog at http://blogs.msdn.com/msbuild/ and the Channel 9 MSBuild Wiki at http://channel9.msdn.com/wiki/default.aspx/MSBuild.HomePage. For the following discussion, I'm assuming that you've at least read the MSBuild documentation and are familiar with MSBuild terminology.

As part of this book's source code, I put together numerous build tasks that you might find useful in your own development. All the code is in the assembly with the highly secretive name of Wintellect.Build.Tasks.DLL. Numerous parts of the book's code build with that DLL, so you can look at the main build project for the book's source code in .\Build\Build.proj to get an idea of real usage.

The first set of tasks I built was to handle build versioning. Having done build number versioning on every project I've ever worked on, I wanted to get it done for the last time. I wanted a task that would read in a file, figure out the build number, increment it, and write the file back out. The IncrementBuildNumberTask handles the build numbering. I also threw in the ability to choose between straight incrementing build numbers and the Microsoft Developer Division format of integers (<year><month><day>.<revision>). The .\Shared directory contains the SharedVersion.xml file I used for all the source code.

Having a file that contains a version is nice, but you want to get that version number into your source files through the AssemblyFileVersionAttribute. That's the job of the GenerateAssemblyFileVersionTask. It reads in the version XML file and spits out C#, Visual Basic, and C++/CLI source files that you can include in your projects. All my projects include a linked file called SharedAssemblyFileVersion.CS or SharedAssemblyFileVersion.VB to have the code applied. In .\Build\Versions.targets, I included a wrapper target, UpdateBuildAndSharedFiles, that calls the two tasks to do the work. To use the versioning tasks, you need only create the XML file to hold your version and set up a project file to use Versions.targets.
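
To make the wiring concrete, here is a sketch of a project that uses the versioning tasks directly. The task and file names come from the text, but the parameter name (File) and target layout are my assumptions, not the exact contents of Versions.targets:

```xml
<!-- Sketch only: the File parameter name is an assumption. -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <UsingTask TaskName="IncrementBuildNumberTask"
             AssemblyFile=".\Wintellect.Build.Tasks.DLL" />
  <UsingTask TaskName="GenerateAssemblyFileVersionTask"
             AssemblyFile=".\Wintellect.Build.Tasks.DLL" />

  <Target Name="UpdateVersion">
    <!-- Bump the build number stored in the shared XML file. -->
    <IncrementBuildNumberTask File=".\Shared\SharedVersion.xml" />
    <!-- Emit SharedAssemblyFileVersion.CS/.VB for projects to link in. -->
    <GenerateAssemblyFileVersionTask File=".\Shared\SharedVersion.xml" />
  </Target>
</Project>
```

In practice you'd import Versions.targets rather than declare the tasks yourself; the sketch just shows the two-step read-increment-generate flow.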

As I was doing the two versioning tasks, I realized that having the version information could be handy in other tasks also. For example, numerous tools that use a preprocessor approach allow you to override a variable or property on the command line with a different value. For those cases, I wanted to be ready to plug in the version number to apply versioning to projects that don't use .NET source files.

The VersionAwareTask lets you define a property, VersionCommandLinePart, with which you can specify the usual .NET formatting code, {0}, where you want the version information to be plugged in to the tool's command line. You can't use the VersionAwareTask on its own because it's an abstract class.

There are several tasks to help with building installations. The WixCandleTask and WixLightTask are there to run the tools from the Windows Installer XML (WiX) Tool Set. If you look at the WiX binaries, you'll see that WiX comes with its own MSBuild tasks, but I started using it before the "Release of the Primes" version in which those tasks first appeared. I'll discuss WiX in more detail in Chapter 4.

One trick that's unique about my WiX tasks is that they are derived from VersionAwareTask. If you look in .\Build\WiX.targets, you'll see that I define a property, WiXVersionCommandLine, that's preset to -dInstallVersion={0}. In your .wxs install files, you can set the Product element's Version attribute to Version="$(var.InstallVersion)" to get the build version directly into your resulting .msi file. For an example, look at either of the installs in the .\Installs directory.
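
In the .wxs file, that pattern looks roughly like the following. The -dInstallVersion definition on the candle command line supplies the value; everything else here (GUID, names) is a placeholder:

```xml
<?xml version="1.0" encoding="utf-8"?>
<Wix xmlns="http://schemas.microsoft.com/wix/2003/01/wi">
  <!-- InstallVersion is defined on the candle command line via
       -dInstallVersion={build version} by the WixCandleTask. -->
  <Product Id="PUT-GUID-HERE"
           Name="Sample Product"
           Language="1033"
           Version="$(var.InstallVersion)"
           Manufacturer="Sample Company">
    <!-- ...rest of the install authoring... -->
  </Product>
</Wix>
```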

If you've ever heard me speak at a conference, you know I'm a huge fan of the new performance and coverage tools in Visual Studio Team Developer Edition and Visual Studio Team Suite Editions. What many people don't realize is that both the coverage and performance tools offer a complete set of command-line tools, so you can do everything without the IDE. In many cases, especially for automated testing, using the command-line tools is much more useful. To make it easier for everyone, I put together several tasks so that you could perform those runs from MSBuild scripts.

The first target, CodeCoverageInstrumentTarget, is a simple wrapper around the command-line instrumentation tool, Vsinstr.exe. The code is all in Coverage.targets and handles instrumenting a binary for code coverage. The target makes it easy to pass in a whole list of assemblies that you want to instrument. In addition, if you define a StrongName property, the CodeCoverageInstrumentTarget will automatically re-sign the instrumented assemblies so that they run. Remember, if a strongly named assembly is changed, it will not run. However, if you re-sign the assembly with Sn.exe, the strong naming utility, that will fix up the signing so the binary will load and execute.

The other two targets in Coverage.targets, StartCoverageMonitorTarget and StopCoverageMonitorTarget, are used to start and stop the code coverage monitor, which is used to collect data from instrumented binaries as they run. The idea of these targets is that you can use them in an automated way to build an automated smoke test that's an integral part of your MSBuild build.
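
The overall flow those targets enable can be sketched as a single MSBuild target. The first, second, and fourth target names come from Coverage.targets; ExecuteSmokeTests is a hypothetical stand-in for whatever actually runs your tests:

```xml
<Target Name="CoverageSmokeTest">
  <!-- 1. Instrument the binaries for coverage (wraps Vsinstr.exe). -->
  <CallTarget Targets="CodeCoverageInstrumentTarget" />
  <!-- 2. Start the coverage monitor so it can collect data. -->
  <CallTarget Targets="StartCoverageMonitorTarget" />
  <!-- 3. Run the smoke tests against the instrumented binaries. -->
  <CallTarget Targets="ExecuteSmokeTests" />
  <!-- 4. Shut the monitor down, which flushes the coverage data. -->
  <CallTarget Targets="StopCoverageMonitorTarget" />
</Target>
```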

In the two monitor targets, the task that does all of the heavy lifting is VSPerfMonTask, exported from Wintellect.Build.Tasks.DLL. My original intent was to have a simple wrapper target around the Vsperfcmd.exe program. However, whenever I started the program with the built-in Exec task, MSBuild hung completely. After quite a bit of head scratching, I finally realized that Vsperfcmd.exe, the control program for performance and coverage data collection, spawns off another program, Vsperfmon.exe, with inherited handles set to true. I quickly whipped up two sample programs the same way and found that the hang on inherited handles is a problem with MSBuild itself.

When I changed my approach to have my simple target start Vsperfmon.exe directly, I was still hanging in Msbuild.exe because Vsperfmon.exe shuts down only when you call Vsperfcmd.exe with the shutdown command. I'm not exactly sure why the coverage and performance monitors are going through all these odd gyrations, but that's how they work.

It finally dawned on me that I was going to have to write some code to work around the hanging from inherited handles and to figure out a way to get Vsperfmon.exe started while allowing MSBuild to continue execution. My VSPerfMonTask tricks MSBuild by starting Vsperfcmd.exe from its own code rather than through the Exec task, which avoids the inherited-handles problem. Because Vsperfcmd.exe starts Vsperfmon.exe and then exits, the task also skips the hang waiting on the Vsperfmon.exe process. It's a little roundabout, but I met my goal of having targets that allow you to get code coverage information as part of a build.

Having a way to instrument your binaries and start the coverage monitor is nice, but there also needs to be a way to run the tests that use those instrumented binaries from an MSBuild project. Visual Studio Team Developer Edition, Visual Studio Team Tester Edition, and the Visual Studio Team Suite Editions come with a very nice tool called MSTest.exe, which can execute all test types from the command line.

The first of the two testing-related tasks in Wintellect.Build.Tasks.DLL is a task called MSTestTask, which wraps MSTest.exe so you can easily use it from your MSBuild projects. There's quite a bit of work inside MSTestTask to handle an odd issue in MSTest.exe related to the output file-naming scheme, because the name can be changed by a run configuration file even if you specify a specific name. Consequently, I had to do more work than you might expect to have MSTestTask figure out file names and keep consistent behavior.

The better part of the testing execution code I've provided is in .\Build\RunTests.targets. The Microsoft testing tools are nice, but they assume that you're going to manually update a file containing your lists of tests (a .vsmdi file) whenever you add a new test. As we all know, any time you have to ask a developer to do something manually, there is a chance it won't get done. What I wanted was an automatic MSBuild project that would start at a specific directory, automatically find all the test code, and then execute any tests found. That's what RunTests.targets is designed to do. You can look at the RunTests.targets file itself for how to use it. It's also a nice example of how you can use the very powerful file searching with exclude items in MSBuild to control exactly what files are found in a search.
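
The search-with-excludes idea at the heart of RunTests.targets can be sketched with a recursive item include. The wildcard and exclude patterns here are illustrative, not the file's actual ones:

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <!-- Find every test assembly under the starting directory, but
         skip intermediate output and the build system itself. -->
    <TestAssemblies Include=".\**\bin\**\*.Tests.dll"
                    Exclude=".\**\obj\**\*.dll;.\Build\**\*.dll" />
  </ItemGroup>

  <Target Name="FindTests">
    <!-- Batching (%) runs the Message once per found assembly. -->
    <Message Text="Found test assembly: %(TestAssemblies.Identity)" />
  </Target>
</Project>
```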

The next set of MSBuild tasks is all about version control. As I started this book, Microsoft had not released Team Foundation Server, so I started with Visual SourceSafe as my version control tool. The SourceSafeTask derives from VersionAwareTask and is a thin wrapper around Ss.exe, the Visual SourceSafe command-line tool. You can look at .\Build\SourceSafe.targets and .\Build\Build.proj to see how I used it to check out and check in the shared version files automatically. The main functionality provided by these SourceSafe tasks is checking my version number files in and out, in addition to providing automatic labeling.

The more interesting set of version control tasks are the SourceIndexTask and the VssSourceIndexTask. Later in this chapter, I'll be discussing in depth the great importance of source indexing with the Source Server. Because Source Indexing is a build-time activity, automating it is a perfect candidate for a set of MSBuild tasks. I'll discuss these two tasks in detail in the "Set Up a Source Server" section near the end of this chapter because they make sense only if you know the details of the Source Server tools.

Finally, there are a couple of other interesting tasks and targets that are part of the source code. To build the ReadMe.chm file, I wrote HhcTask to make running the HTML Help Compiler easy from a build. In .\Build\CleanUp.targets, there are three useful targets to keep your source trees clean and sanitary. CorrectClean is intended as a replacement for the standard Clean target supplied by Microsoft. The problem with Microsoft's Clean target is that it not only removes the intermediate files, it also removes all the output files. In the normal development world, that's what's called a Really Clean; that has been the name of that type of clean since before I started working with software. My CorrectClean will remove all the intermediate files but keep the output files. Because I want to be able to install the compiled binaries as part of my installation but not install all the temporary files in those OBJ directories, CorrectClean is a big help.

To clean up all the extraneous files created by running Code Analysis and programs such as the *.vshost.exe files, you can use the CrudCleaner task. The last scrubber task in .\Build\CleanUp.targets is RemoveAllDevelopmentTests, which removes all the directories and files created when using the Visual Studio testing tools inside the IDE to run your unit tests. When doing Test Driven Development with the Visual Studio testing tools, the run files can chew up a considerable amount of disk space. Once you have saved off the key test results, you can use RemoveAllDevelopmentTests to get rid of all the unnecessary run files with a single statement in an MSBuild project file.

In addition to the MSBuild tasks, there are two other collections of tasks that will definitely make your life easier. The first is the open source MSBuild Community Tasks Project at http://msbuildtasks.tigris.org/, which offers complete tasks for Subversion among other things. The second is the huge Microsoft Services (UK) Enterprise Solutions Build Framework collection put together by Andy Reeves and friends. It has tasks to create a Microsoft Active Directory account, control SQL Server 2005, and configure a Microsoft BizTalk Server. You can find the SDC collection at http://www.gotdotnet.com/codegallery/codegallery.aspx?id=b4d6499f-0020-4771-a305-c156498db75e.

Writing Your Own MSBuild Tasks

You should never hesitate to look at writing a custom task or target for any tools you have. Of course, a good Google search might locate the code to the task you need, so use existing code first. However, every company I've ever worked for or consulted for had several special tools that had to run as part of their build. Now that .NET is starting to standardize on MSBuild, your tasks can be utilized by others in your company or team very easily.

If you don't want to write any .NET code, you can use the Exec task to wrap the call to your tool. The Exec task provides an output property, ExitCode, that reports the tool's exit code so you can verify proper tool execution, which means you can wrap nearly any command-line tool. The drawback to the Exec task is that you can't gather output from the tool and parse it to provide better error or warning messages. If you want to look at an example of using the Exec task, look at CodeCoverageInstrumentTarget in .\Build\Coverage.targets, where I use the Exec task to start the Vsinstr.exe program to perform coverage instrumentation.
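
A minimal sketch of the Exec pattern follows: run a tool, capture its exit code with an Output element, and fail the build with a clearer message. MyTool.exe and its switch are placeholders:

```xml
<Target Name="RunMyTool">
  <!-- ContinueOnError lets us inspect the exit code ourselves. -->
  <Exec Command="MyTool.exe /verify" ContinueOnError="true">
    <Output TaskParameter="ExitCode" PropertyName="MyToolExitCode" />
  </Exec>
  <Error Text="MyTool.exe failed with exit code $(MyToolExitCode)."
         Condition="'$(MyToolExitCode)' != '0'" />
</Target>
```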

The next step to maximizing your use of MSBuild is to write your tasks in the .NET language of your choice. The MSBuild documentation is quite good, and there are a sufficient number of useful samples available on the Internet to get you going. Regrettably, I haven't seen any discussion of one of the better tricks I've found in writing MSBuild tasks: deriving your task from the excellent ToolTask class from Microsoft.Build.Utilities.dll, which is in the Framework directory. ToolTask wraps up much of the common functionality every task needs, so you don't have to implement the ITask interface directly.

ToolTask is an abstract class, and the two members you must always provide are the ToolName property, which returns the program name of the tool, and the GenerateFullPathToTool method. Nothing in the documentation on building your own tasks discusses the fact that Msbuild.exe absolutely has to have the full path to your tool. Even if your tool is in the path, Msbuild.exe will still fail the task because it does no path searching. You'll provide an implementation of the GenerateFullPathToTool method to give that information to Msbuild.exe.

One item missing from the Framework Class Library (FCL) is a wrapper around SearchPath, the Windows API that does all the work of finding a file given the PATH environment variable. You'll need to make the PInvoke call to the method yourself. Copy the NativeMethods.CS file from my .\Wintellect.Build.Tasks directory, and you'll have the code you need to find that program.

Another area that will trip up your task is the 32-bit/64-bit divide. One of the first tasks I wrote was the HhcTask, which handles the HTML Help Compiler. Because the HTML Help Compiler is installed by Visual Studio into a fixed directory, C:\Program Files\HTML Help Workshop, I blindly assumed that if I simply dropped the hard-coded path into my task, I'd be successful. In fact, the HhcTask worked great on the 32-bit machine I was testing on.

If you've ever looked at a 64-bit machine, you might have noticed there is something odd when you look for the Program Files directory; there are actually two. There's C:\Program Files, but there's also C:\Program Files (x86). The "(x86)" should be the giveaway that this directory has something to do with 32-bit programs, and that's where the system does put all 32-bit programs. All 64-bit programs go into C:\Program Files. If the tool that you're working with has both 32-bit and 64-bit variants, and it's always installed into the C:\Program Files directory, you'll have no trouble hard-coding the path.

Alas, in the case of the HTML Help Compiler, there's no 64-bit variety, so the first time I ran my build using my HhcTask on a 64-bit system, I got an error indicating that it couldn't find the program to execute. Consequently, you can look at the code for HhcTask and see that I'm checking first in the C:\Program Files directory, and if I don't find Hhc.exe, I look in the C:\Program Files (x86) directory.

In the case of HTML Help Workshop, you can get away with the hard-coded path because you have no choice about where the help compiler is installed. For the MSTestTask, I faced a different problem. Because you do have a choice where Visual Studio is installed, I can't hard-code C:\Program Files\Microsoft Visual Studio\and_the_40-level_deep_directory_structure\Mstest.exe. My hunt brings up the second issue with the 32/64 divide: the registry.

As I've mentioned, 32-bit applications that install on Win64 go to a different Program Files directory, and their registry keys also go to a different place. For example, to find where Visual Studio is installed on a 32-bit machine, you look in the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\8.0 registry key. If, on a Win64 machine, your task runs under the 32-bit version of Msbuild.exe, the WOW64 registry redirector transparently maps that lookup for you, so it works just fine. However, if your task runs under the 64-bit version of Msbuild.exe, the key you actually need to read is HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\8.0. Notice the Wow6432Node in the middle of the key.
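
A simple way to cope with both cases is to try the native key first and fall back to the Wow6432Node key. The sketch below assumes the install path lives in an InstallDir value under the key; that value name is my assumption, not something stated in the text:

```csharp
using Microsoft.Win32;

internal static class VisualStudioLocator
{
    // Returns the Visual Studio 8.0 install directory, or null if
    // neither the native nor the Wow6432Node key is present.
    public static string GetVisualStudioDir()
    {
        string[] keyNames = {
            @"SOFTWARE\Microsoft\VisualStudio\8.0",
            @"SOFTWARE\Wow6432Node\Microsoft\VisualStudio\8.0"
        };
        foreach (string keyName in keyNames)
        {
            RegistryKey key = Registry.LocalMachine.OpenSubKey(keyName);
            if (key != null)
            {
                using (key)
                {
                    string dir = key.GetValue("InstallDir") as string;
                    if (dir != null)
                    {
                        return dir;
                    }
                }
            }
        }
        return null;
    }
}
```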

Neither of these two issues is major, but I wanted to mention them to aid you in creating tasks that successfully find the executable files that you need to run. Look at my MSTestTask for an example of properly looking for registry values that will work no matter what operating system your task runs on.

What makes the ToolTask so nice is that it already knows how to do the tool execution, and it has a very well-thought-out API that you can take advantage of so that it's even easier to write your task. With most tools, you'll want to provide some sort of parameter validation, and that's what the virtual ToolTask.ValidateParameters method is all about. If you return false from your override, MSBuild will report a failure, and the build will be stopped.

In addition to controlling the execution, you'll want to take advantage of the excellent internationalization support in ToolTask (inherited from the Task base class) by setting the TaskResources property to your own ResourceManager. That way you can use the Log property on ToolTask to get the corresponding TaskLoggingHelper instance and call its LogErrorFromResources to properly report any error. There are numerous other methods on the TaskLoggingHelper returned by the Log property so that you can report all sorts of information based on the logging level and your particular desires.

To build the command lines to tools, Microsoft.Build.Utilities.dll has a very nice public class called CommandLineBuilder. It's a smart wrapper around a StringBuilder that will append a particular switch only if its value is not null/Nothing. The CommandLineBuilder class is a bit rudimentary, so I extended it with the ExtendedCommandLineBuilder in Wintellect.Build.Tasks.DLL to add even more smarts. The idea is that no matter which particular switch you want to use, you can simply use the ExtendedCommandLineBuilder and one of its methods without resorting to fancy parsing or analysis.

In your ToolTask-derived class, you have GenerateCommandLineCommands, which is the perfect method in which to build your command lines. That's where you'll use the ExtendedCommandLineBuilder to do all the major work. ToolTask also supports methods to build response files, which are commands in a text file, so you have the ultimate in flexibility for getting options passed to the tool you're wrapping.

I've already mentioned the Execute method, which, as you can guess, starts the tool. What's great about the ToolTask.Execute method is that it calls the ValidateParameters and GenerateCommandLineCommands methods for you, so you need only build up the tool command line. The only time you'll need to override Execute is when you must do special processing. For example, any time you deal with the Visual SourceSafe command-line tool, Ss.exe, you need to set the SSDIR environment variable to point to the database to use. For what must be insane historical reasons, Visual SourceSafe will not let you specify the database on the Ss.exe command line. In tasks such as VssSourceIndexTask, I overrode Execute to set the SSDIR environment variable before calling the base class's Execute method.

The final method from ToolTask that I want to mention is the excellent LogEventsFromTextOutput. When writing tasks for a tool, you usually need to look at the output to determine whether any errors occurred. If you were writing your tasks directly from the raw ITask interface, you would have to provide an Execute method that redirects output to a file and then perform some tough parsing to determine whether there's an error.

The ToolTask.LogEventsFromTextOutput method is the result of ToolTask handling all tool output line by line and calling this method on each line. By overriding the LogEventsFromTextOutput method, you can grab the output as it's coming through and determine if there have been any problems. For an excellent example of making your life drastically simpler by overriding LogEventsFromTextOutput, see the SourceIndexTask in Wintellect.Build.Tasks.DLL.
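
The ToolTask pieces discussed in this section can be collected into a minimal skeleton. This is a sketch only: Widget.exe, its /in: switch, the C:\Tools path, and the "error" string match are made-up placeholders, and real tasks such as the ones in Wintellect.Build.Tasks.DLL do considerably more:

```csharp
using System;
using System.IO;
using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

// Hypothetical wrapper around a fictitious Widget.exe tool.
public class WidgetTask : ToolTask
{
    private string inputFile;

    [Required]
    public string InputFile
    {
        get { return inputFile; }
        set { inputFile = value; }
    }

    // The bare program name; ToolTask uses it when running the tool.
    protected override string ToolName
    {
        get { return "Widget.exe"; }
    }

    // MSBuild does no PATH searching, so return the full path yourself.
    protected override string GenerateFullPathToTool()
    {
        return Path.Combine(@"C:\Tools", ToolName);
    }

    // Returning false stops the build before the tool ever runs.
    protected override bool ValidateParameters()
    {
        if (!File.Exists(InputFile))
        {
            Log.LogError("Input file '{0}' does not exist.", InputFile);
            return false;
        }
        return true;
    }

    // Execute calls this for you to build the command line.
    protected override string GenerateCommandLineCommands()
    {
        CommandLineBuilder builder = new CommandLineBuilder();
        builder.AppendSwitchIfNotNull("/in:", InputFile);
        return builder.ToString();
    }

    // Called once per line of tool output; turn errors into build errors.
    protected override void LogEventsFromTextOutput(
        string singleLine, MessageImportance messageImportance)
    {
        if (singleLine.IndexOf("error",
                StringComparison.OrdinalIgnoreCase) >= 0)
        {
            Log.LogError(singleLine);
        }
        else
        {
            base.LogEventsFromTextOutput(singleLine, messageImportance);
        }
    }
}
```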

Now that you have a good idea of the possibilities of MSBuild and how easy it is to write tasks, go forth and task away! You can easily automate your entire build from the physical build all the way through smoke testing. To see an example, look at the work done in the book's main build file in .\Build\Build.proj. I think it's a testament to the skill of the developers who designed MSBuild that I could put together that complicated a build with just a little bit of work.

Smoke Tests

In case you're not familiar with the term, a smoke test is a test that checks your product's basic functionality. The term comes from the electronics industry. At some point in a product's life cycle, electronics engineers would plug in their product to see whether it smoked (literally). If it didn't smoke, or worse, catch fire, they were making progress. In most software situations, a smoke test is simply a run-through of the product to see whether it runs and is therefore good enough to start testing seriously. A smoke test is your gauge of the baseline health of the code.

Your smoke test is just a checklist of items that your release build program can handle. Initially, start out small: install the application, start it, and shut it down. As you progress through the development cycle, your smoke test needs to grow to exercise new features of the product. The best rule of thumb is that the smoke test should contain at least one test for every feature and major component of the product. If you are in a shrink-wrap company, that means testing each feature that appears in a bullet point for your advertisements. In an IT shop, that means testing each of the major features you promised the CIO and your client. Keep in mind that your smoke test doesn't need to exhaustively test every code path in your program, but you do want it to judge whether you can handle the basics. Once your program passes the smoke test, the quality engineers can start doing the hard work of trying to break the program in new and unique ways.

One vital component of your smoke test is some sort of performance benchmark. Many people forget to include these and pay the price later in the development cycle. If you have an established benchmark for an operation (for example, how long the last version of the product took to run), you can define failure as a current run that is 10 percent or more over or under your benchmark. I'm always amazed by how many times a small change in an innocuous place can have a detrimental impact on performance. By monitoring performance throughout the development cycle, you can fix performance problems before they get out of hand.

The ideal situation for a smoke test is one in which your program is automated so that it can run without requiring any user interaction. The tool you use to automate the input and operations on your application is called a regression-testing tool. Unfortunately, you can't always automate every feature, especially when the user interface is in a state of flux. If you have Visual Studio Team Tester Edition or Visual Studio Team Suite, you can use the excellent WebTest test type to automate and validate any HTTP-based application.

For automating rich client applications, such as a Windows Forms-based product, you'll need to turn to a user interface automation tool, such as Mercury WinRunner (http://www.mercury.com/us/products/quality-center/functional-testing/winrunner/). Another alternative is IBM's Rational Robot (http://www-306.ibm.com/software/awdtools/tester/robot/index.html). Unfortunately, these industrial-strength user interface automation tools are extremely expensive. If you are doing Windows Forms applications, James McCaffrey offered up a quick way to automate the user interface in his January 2003 Test column in MSDN Magazine (http://msdn.microsoft.com/msdnmag/issues/03/01/UITestAutomation/).

Breaking the smoke test should be as serious a crime as breaking the build. It takes more effort to create a smoke test, and no developer should treat it lightly. Because the smoke test is what tells your QA team that they have a build that's good enough to work on, keeping the smoke test running is mandatory. If you have an automated smoke test, you should also consider making it available to the developers so that they can use it to help automate their own testing. Additionally, with an automated smoke test, you should have the daily build start the smoke test so that you can immediately gauge the health of the build. As with the daily build, you should notify the team via e-mail to let them know whether the smoke test succeeded or failed.




Debugging Microsoft .NET 2.0 Applications
ISBN: 0735622027
Year: 2006
Authors: John Robbins