You can't build a testable version of your game by simply grabbing the latest source code and launching the compiler. Most games have multiple gigabytes of data, install programs, multiple languages, special tools, and all manner of components that have nothing at all to do with the executable. All of these components come together in one way or another during the build. Every shred of code and data must make it onto the install image on one or more CDs or on the network for the test team. Frequently, these components don't come together without a fight. Building the game is something of a black art, assigned to the most senior code shamans.
Ultima VIII had a build process that was truly insane. It went something like this:
1. Grab the latest source code: editor, game, and game scripts.
2. Build the game editor.
3. Run the game editor and execute a special command that nukes the local game data files and grabs the latest ones from the shared network drive.
4. Build the game.
5. Run the UNK compiler (Ultima's game scripting language) to compile and link the game scripts for English. Don't ask me what UNK stands for...
6. Run the UNK compiler twice more to compile the French and German game scripts.
7. Run the game and test it. Watch it break and loop back to Step 1, until the game finally works.
8. Copy the game and all the game data into a temp directory.
9. Compress the game data files.
10. Build the install program.
11. Copy the English, French, and German install images to 24 floppy disks.
12. Copy the CD-ROM image to the network. (The only CD burner was on the first floor.)
13. Go to the first floor media lab and make three copies of each install: 72 floppy disks and three CDs. Then, hope like hell there are enough floppy disks.
Before you ask, I'll just tell you that the fact that the build process for Ultima VIII had thirteen steps never sat very well with me. Each step generally failed at least twice for some dumb reason, which made building Ultima VIII no less than a four-hour process—on a good day.
The build was actually fairly automated with batch files. The game editor even accepted command line parameters to grab the latest map and other game data. Even so, building Ultima VIII was so difficult and fraught with error that I was the only person who ever successfully built a testable version of the game. That wasn't an accomplishment; it was a failure.
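To give a flavor of what that automation looked like, here's a purely illustrative batch file covering the first six steps. None of the tool names, switches, or paths below come from the real Ultima VIII scripts; they're hypothetical stand-ins for the kind of thing those scripts did.

@echo off
rem Purely illustrative -- every tool name, switch, and path below is a
rem hypothetical stand-in, not the actual Ultima VIII build scripts.

rem Steps 1 and 2: grab the latest source and build the editor.
call get_latest_source.bat
if errorlevel 1 goto failed
call build_editor.bat
if errorlevel 1 goto failed

rem Step 3: have the editor refresh local game data from the network share.
editor.exe -refreshdata \\server\gamedata
if errorlevel 1 goto failed

rem Step 4: build the game itself (nmake stands in for whatever compiler
rem driver the project actually used).
nmake /f game.mak
if errorlevel 1 goto failed

rem Steps 5 and 6: compile the game scripts for each language.
unkcomp.exe -lang english
if errorlevel 1 goto failed
unkcomp.exe -lang french
if errorlevel 1 goto failed
unkcomp.exe -lang german
if errorlevel 1 goto failed

echo Build steps 1 through 6 succeeded.
goto end

:failed
echo Build FAILED -- check the output above.

:end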
On one of my trips to Microsoft I learned something about how they build Office. The build process is completely automatic. The build lab for Office has a fleet of servers that build every version of Office in every language, and they never stop. The moment a build is complete they start again, constantly looking for compile errors introduced by someone in the last few minutes. Office is a huge piece of software. If Microsoft can automate a build as big and complex as this, surely you can automate yours.
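You don't need a fleet of servers to get most of that benefit. A single dedicated machine running a dumb loop will catch compile errors minutes after they're checked in. The sketch below assumes Perforce and a build script named build_game.bat; both are stand-ins for whatever source control client and build script your project actually uses.

@echo off
rem A minimal continuous-build loop for a single dedicated build machine.
rem "p4 sync" and "build_game.bat" are stand-ins; substitute your own
rem source control client and build script.

:loop
p4 sync //depot/game/...
if errorlevel 1 goto broken

call build_game.bat > lastbuild.log 2>&1
if errorlevel 1 goto broken

echo %date% %time% build OK >> buildhistory.log
goto loop

:broken
echo %date% %time% BUILD BROKEN -- see lastbuild.log >> buildhistory.log
goto loop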
My experience has taught me that every project can and should have an automatic build. No exceptions. It's far easier (and safer) to maintain build scripts that automate the process instead of relying on a witchdoctor. My suggestion is that you should try to create Microsoft's build lab in miniature on your own project. Here is what's needed:
Create a build machine.
Find good tools for automatic building.
Invest time up front creating automation scripts (and make sure you maintain them as your project progresses).
Don't try to save a buck and use a programmer's development box as your build machine. Programmers are always downloading funky software, making operating system patches, and installing third-party development tools that suit their needs and style. A build machine should be a pristine environment that has known versions and updates for each piece of software: the operating system, compiler, internal tools, install program, and anything else used to build the game.
Best Practice: A complete backup of the build machine is good insurance. If you need to build an old project, the backup of the build machine will have the right versions of the compiler, operating system, and other tools. New versions and patches come out often, and even a project just twelve months old can be impossible to build, even if the source code is readily available in the source code repository. Just try to build something ten or twelve years old and you'll see what I mean. If anyone out there has a good copy of Turbo Pascal and IBM DOS 3.3, let me know!
The build machine should be extremely fast, have loads of RAM, and have a nice hard disk—preferably multiple hard disks. Compiling is RAM and hard disk intensive, so try to get the penny pinchers to buy a nice system. If you ever used the argument about how much money your company could save by buying fast computers for the programmers, imagine how easy it will be to buy a nice build machine. The entire test team might have to wait on a build; how much is that worth?
Automated builds have been around as long as there have been makefiles and command line compilers. I admit that I've never been good at the cryptic syntax of makefiles, which is one reason I've put off automating builds. If you use Visual Studio, you might consider using the pre-build or post-build settings to run some custom batch files or makefiles. I wouldn't, and here's why: You'll force every programmer to run the build scripts every time they build. That's probably wasteful at best, completely incorrect at worst.
Pre-build and post-build steps should run batch files, makefiles, or other utilities that are required every time the project is built. Build scripts tend to extract and prepare game executables and data for the test team. As an example, the build script will always grab the latest code from the source repository and rebuild the entire project from scratch. If you forced every programmer to do that for every compile, they'd lynch you.
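To make the distinction concrete, here's a hedged sketch of the kind of dedicated build script I mean, the hypothetical build_game.bat from earlier. The source control commands, solution name, staging paths, and packaging tools are all assumptions; substitute whatever your project uses.

@echo off
rem build_game.bat -- a sketch of a dedicated, from-scratch build script.
rem Every tool and path below is an assumption: swap in your own source
rem control client, solution name, compression tool, and install builder.

rem 1. Get a clean copy of the latest code and data.
if exist C:\build\game rmdir /s /q C:\build\game
p4 sync -f //depot/game/...
if errorlevel 1 goto failed
cd /d C:\build\game

rem 2. Rebuild the release configuration from scratch.
devenv Game.sln /rebuild Release
if errorlevel 1 goto failed

rem 3. Stage the executable and data, compress, and build the install image.
xcopy /e /i /y C:\build\game\data C:\build\staging\data
if errorlevel 1 goto failed
datacompress.exe C:\build\staging\data
if errorlevel 1 goto failed
makeinstall.exe C:\build\staging C:\build\install\setup.exe
if errorlevel 1 goto failed

echo Install image ready for the test team.
goto end

:failed
echo Build FAILED.
exit /b 1

:end

A legitimate post-build step, by contrast, is a single cheap command, such as copying the freshly linked executable into the directory the game runs from. The heavy lifting above belongs on the build machine, not in every programmer's compile.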
Batch files and makefiles are perfectly fine solutions for any build script you need. There are some better tools for those like myself who like GUIs, such as Visual Build Pro from Kinook Software (see Figure 4.2).
Figure 4.2: Visual Build Pro from Kinook Software.
This tool is better than batch files or makefiles because you can understand a complicated build process, complete with failure steps and macros, at a glance. The build script is hierarchical, and each group can take different steps if a component of the build fails. Visual Build also integrates cleanly with a wide variety of development tools and source code repositories.
Whatever scripting tool you use, make sure it can run from the command line. If you create internal tools to edit or analyze map data, run a proprietary compression technology, or perform a related task, those tools must be able to take their input from the command line, or you won't be able to automate your build process.
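For example, a hypothetical map analysis tool that accepts everything it needs as command line switches drops straight into the build script and can fail the build on its own; a GUI-only version of the same tool leaves the whole pipeline waiting on a human. The tool name and switches below are made up for illustration.

rem A hypothetical internal tool driven entirely by command line switches,
rem so the build script can run it unattended and check the result.
mapcheck.exe -in data\maps\world.map -report logs\mapcheck.txt -failonerror
if errorlevel 1 goto failed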