The Windows Hardware Quality Lab
Microsoft really wants hardware devices and their associated drivers to meet certain minimum standards for quality, interoperability, and ease of use for consumers. To that end, Microsoft established the Windows Hardware Quality Lab (WHQL) in 1996. WHQL's basic mandate is to publish and administer an evolving set of Hardware Compatibility Tests for systems and peripherals. Successfully passing these tests confers three basic benefits:
Entrée to various marketing programs, such as the Designed for Windows logo and various lists maintained by Microsoft.
A digital signature for your driver package, which greatly eases installation on end user machines.
Free distribution of your driver through Windows Update and other means.
Your starting point for working with WHQL is http://www.microsoft.com/hwdq/hwtest. As I remarked in Chapter 1, it's important to get started early in a development project because there are a number of legal and business hurdles to surmount before you even get to the point of asking WHQL to test your hardware and software. Once past those hurdles, you will need to acquire systems and hardware that you wouldn't otherwise need to own but that some of the prescribed tests for your class of device require.
Running the Hardware Compatibility Tests
When you're ready to begin the WHQL certification process, you'll start by running the relevant Hardware Compatibility Tests. Microsoft distributes the HCT as part of certain MSDN subscriptions and beta programs. You can also download the tests over the Internet. The installation wizard allows you to pick one or more categories of test to perform. If you're just testing the driver for one device, I recommend that you select just the one test category that includes your device in order to minimize the number of meaningless choices that you might have to make later. For example, if you install tests for Pointing And Drawing Devices and Smart Card Readers, you'll have to pick one device in each category before the Test Manager will let you begin testing in either category.
To provide a concrete example, I decided to run the tests for a USB gaming mouse for which I wrote the driver. Figure 15-14 shows how I selected the relevant test category while installing the HCT.
Figure 15-14. Selecting a test category.
The HCT setup program automatically kicks off a wizard that allows you to select a device for testing. A dialog box reminds you that you need to have the hardware and software installed at this point. For each of the categories you installed, you'll fill in a dialog box like the one shown in Figure 15-15. My device appears as a HID-compliant mouse. Microsoft is listed as the manufacturer because HCT thinks that my device uses the standard MOUHID.SYS driver. In fact, my mouse uses a nonstandard HIDCLASS minidriver with many features that need to be tested beyond the basic things that HCT will test.
Figure 15-15. Selecting a device for testing.
HCT presents several additional dialog boxes before it gets to the point where testing can begin. Figure 15-16 illustrates the basic test manager dialog box. Ideally, you would just press the button labeled Add Not Run Tests, which would populate the right-hand pane with all of the tests. A bit of circumspection is called for here, however.
Figure 15-16. The Test Manager dialog box.
One of the tests, the ACPI stress test, runs for many hours, if it runs at all. Many computers can't run this test, and the laptop on which I was doing this testing is one of them. To run this test, you need XP Professional on a desktop system that supports the S1 and S3 states or a notebook that supports S1 or S3. (I was using Windows XP Home Edition because USB wake-up stopped working on the notebook if I upgraded, and USB wake-up testing was the only reason I bought that particular notebook.) I suspect that I'll never own a computer that can run this test because I tend to buy computers with the operating system preinstalled and then upgrade the operating system as part of a beta program, whereupon power management stops working.
The USB Manual Interoperability test requires several hundred dollars' worth of multimedia hardware that I would have no use for beyond running this one test suite. (Figure 15-17 is a screen shot from a test run when I made the mistake of allowing this test to commence.) This test is pretty important from the hardware point of view because it verifies that commonly used USB devices will continue to work with your device plugged in and vice versa.
Figure 15-17. Required hardware topology for the USB Manual Interoperability test.
Other tests are actually useless for telling me anything about the quality of my own driver. The DirectInput Mouse test verifies that Microsoft's drivers interact correctly with DirectInput, a fact I never doubted. The USB Selective Suspend test isn't currently very important for a HID device because HIDCLASS never suspends a device in the first place: most devices can't wake up without losing an input event. In fact, all of the automated USB tests relate to hardware issues. I decided to let them run in this particular example because I was working closely at the time with a leading firmware engineer in getting this product to market. When I was done selecting the tests that I expected to be able to perform (whether or not they would succeed was a different question, to which I actually wanted the answer), my Test Manager dialog box looked as shown in Figure 15-18.
Figure 15-18. I'm ready to start testing.
The very first thing the test engine does is engage the driver verifier on the wrong drivers and reboot the computer. Remember that HCT thinks MOUHID.SYS is the driver for my mouse. In reality, the verifier should be getting turned on for my minidriver instead. Attempting to do that by hand would invalidate the test run, though, so I allowed the test run to continue. I'm told that newer versions of the HCT will do a better job of identifying which driver needs to be tested. I later ran tests with the verifier turned on for my driver. It was a good thing I did because I caught a rookie mistake in the way my HID minidriver was forwarding a device IRP_MJ_POWER with the minor function code IRP_MN_SET_POWER and the power type SystemPowerState after waiting for its interrupt-endpoint polling interrupt request packet (IRP) to finish.
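For readers who want to see the shape of the code involved, here is a minimal sketch of the canonical pass-down pattern for a power IRP that a WDM driver doesn't handle itself. It isn't my actual minidriver code; the DispatchPower routine, the DEVICE_EXTENSION layout, and the LowerDeviceObject field are invented for the example. The general rule is that such an IRP should be sent on its way promptly with PoStartNextPowerIrp and PoCallDriver rather than being queued behind other outstanding I/O.

#include <wdm.h>

// Minimal sketch (not the author's actual code): forwarding a power IRP
// that this driver doesn't handle itself. DEVICE_EXTENSION and
// LowerDeviceObject are hypothetical names made up for the example.

typedef struct _DEVICE_EXTENSION {
    PDEVICE_OBJECT LowerDeviceObject;   // next-lower device in the stack
} DEVICE_EXTENSION, *PDEVICE_EXTENSION;

NTSTATUS DispatchPower(PDEVICE_OBJECT fdo, PIRP Irp)
{
    PDEVICE_EXTENSION pdx = (PDEVICE_EXTENSION) fdo->DeviceExtension;

    // Tell the power manager it can send the next power IRP. This call
    // belongs here, while we still own the current stack location.
    PoStartNextPowerIrp(Irp);

    // We add nothing on the way back up, so skip our stack location
    // rather than copying it.
    IoSkipCurrentIrpStackLocation(Irp);

    // Power IRPs are forwarded with PoCallDriver, and they shouldn't be
    // made to wait behind unrelated I/O such as an interrupt-endpoint
    // polling IRP.
    return PoCallDriver(pdx->LowerDeviceObject, Irp);
}

With the verifier enabled on the right driver, ordering mistakes in this path tend to show up quickly.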
The Mouse Functionality test (see Figure 15-19) is the one most relevant to the quality of my driver in that it verifies whether I am actually delivering mouse reports in the format expected by the system. Because my mouse lacks an actual wheel (users can program some of its buttons to act as a wheel), I had to fudge part of the functionality test with another mouse attached to the same system.
Figure 15-19. The Mouse Functionality test.
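As background, the "format expected by the system" is ultimately whatever the device's report descriptor declares: MOUHID.SYS consumes a top-level Generic Desktop/Mouse collection that reports buttons plus relative X and Y (and a wheel, if one is declared). The descriptor below is a generic sketch of a three-button, wheel-less mouse of the kind under test here, not the actual descriptor from my client's device.

// Generic HID report descriptor for a three-button mouse with no wheel.
// Each input report is 3 bytes: button bits, X delta, Y delta.
static const unsigned char MouseReportDescriptor[] = {
    0x05, 0x01,    // USAGE_PAGE (Generic Desktop)
    0x09, 0x02,    // USAGE (Mouse)
    0xA1, 0x01,    // COLLECTION (Application)
    0x09, 0x01,    //   USAGE (Pointer)
    0xA1, 0x00,    //   COLLECTION (Physical)
    0x05, 0x09,    //     USAGE_PAGE (Button)
    0x19, 0x01,    //     USAGE_MINIMUM (Button 1)
    0x29, 0x03,    //     USAGE_MAXIMUM (Button 3)
    0x15, 0x00,    //     LOGICAL_MINIMUM (0)
    0x25, 0x01,    //     LOGICAL_MAXIMUM (1)
    0x95, 0x03,    //     REPORT_COUNT (3)
    0x75, 0x01,    //     REPORT_SIZE (1)
    0x81, 0x02,    //     INPUT (Data,Var,Abs)   -- button states
    0x95, 0x01,    //     REPORT_COUNT (1)
    0x75, 0x05,    //     REPORT_SIZE (5)
    0x81, 0x03,    //     INPUT (Cnst,Var,Abs)   -- padding bits
    0x05, 0x01,    //     USAGE_PAGE (Generic Desktop)
    0x09, 0x30,    //     USAGE (X)
    0x09, 0x31,    //     USAGE (Y)
    0x15, 0x81,    //     LOGICAL_MINIMUM (-127)
    0x25, 0x7F,    //     LOGICAL_MAXIMUM (127)
    0x75, 0x08,    //     REPORT_SIZE (8)
    0x95, 0x02,    //     REPORT_COUNT (2)
    0x81, 0x06,    //     INPUT (Data,Var,Rel)   -- X and Y deltas
    0xC0,          //   END_COLLECTION
    0xC0           // END_COLLECTION
};

A report that doesn't match what the descriptor promises is exactly the kind of mismatch this test exists to catch.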
The Public Import and Signability tests both asked whether my product installs it's [sic] own driver. I answered that it does and pointed the test engine to a directory where I had placed my INF and all the other files that get installed on any platform. The import test verified that my driver wasn't calling any verboten kernel-mode functions. The signability test verified, among other things, that all files copied by my INF file were in fact present. (Recall that CHKINF doesn't do this.)
The CHKINF test ran CHKINF on the wrong INF file, namely the Microsoft-supplied INPUT.INF. Being a good citizen, I ran CHKINF myself. The Perl test script initially failed because it lacked a copy of STRICT.PM, which I found in the HCT directory and copied by hand. The test report told me that a RunOnce entry running CONTROL.EXE (my solution to a client request to automatically launch their control panel) was not allowed because it didn't involve RUNDLL32. Since I had always regarded that particular client request as a bad idea, I resolved to use the test failure as a lever to get my client to change his mind. Mind you, I'm sure I could have thought of a way to use RUNDLL32 to launch a control panel applet, but doing that would defeat the real but unstated goal of the test, which is to make sure that a server-side install can proceed without the intrusion of user-interface elements.
The remainder of the tests I scheduled happened without my needing to intervene, which is why I guess they're called automated tests. In the end, I got the test log shown in Figure 15-20.
The reason that the Enable/Disable test failed to generate a log is that it generated an exception in user mode. Some part of the test engine caught the exception and silently terminated that test.
Figure 15-20. Test results after running selected tests.
I worked with my firmware engineer colleague to iron out the failures in the various USB tests. In doing this, it would have been very helpful to correlate test failures with the HCT documentation entries for the same tests. For example, the USB Address Description test log referred to a test assertion numbered 9.22.6. After opening the HCT 10.0 documentation from the Start menu, I browsed to the section labeled Resources/WHQL Test Specification/Chapter 9 USB Test Specification/USB Test Assertions/Address Test Assertions, where I found the information shown in Figure 15-21. Test assertion number 9.22.6 is, uh, well, something important, probably.
Figure 15-21. Documentation for test assertions.
You'll notice that many things went wrong in the testing process. To summarize:
I couldn't run some of the tests because of hardware or budget limitations. I wouldn't be able to put together a WHQL submission for my client. As it turns out, he doesn't have the resources either and will have to hire an outside contractor who specializes in WHQL testing. He doesn't actually want a logo, though. In fact, the counterculture he sells into would prefer that his mouse not have a logo. He needs a digital signature, though, because of the driver ranking problem discussed earlier in this chapter.
One of the tests failed on its own for unguessable reasons.
A few of the tests were testing the wrong thing.
A few of the failed test assertions I encountered weren't documented.
What you would do in a similar situation is ask for help. WHQL personnel monitor several newsgroups on the msnews.microsoft.com news server, including microsoft.public.development.device.drivers and microsoft.public.windowsxp.winlogo. WHQL also responds to e-mail requests for assistance at addresses accessible from the WHQL home page, http://www.microsoft.com/hwdq/hwtest.
Submitting a Driver Package
The last step in running the Hardware Compatibility Tests would be to create a WHQL submission package. You'll want to do this separately for each operating system that your driver supports and then gather together the resulting CAB files in one convenient place. Your next step, which I think you should actually have performed months prior, would be to visit http://winqual.microsoft.com and get yourself signed up as a WHQL client company.
Given a login ID and a password, you can log on through the winqual page to do any of several things:
You can create a new submission package.
You can review the status of a previous submission.
You can retrieve error reports that users worldwide have submitted that apparently arise from your product.
For this chapter, I wanted to create a new submission for a new hardware device. Figure 15-22 is a screen shot showing the starting point for a brand-new submission.
Figure 15-22. Initial screen for a new WHQL submission.
From the point shown in Figure 15-22, Web forms lead through the process of characterizing your submission in a relatively painless way. You'll answer questions such as these:
What kind of product is it? I said my product was in the Input/Pointing Drawing class, which is the same as the test category I used when I was running the HCT.
With which operating systems will the product be used? You want to be sure you've run the relevant HCT on all the platforms you select because you'll later have to identify the test results for each of them.
What are two e-mail contacts (including yourself) for communications related to the submission? I'm not sure what you do if you're a one-person company. (I was doing my testing as a nominal member of a dummy company that the WHQL folks use for their own internal testing, so I didn't have a chance to see how this particular problem would be resolved.)
What, exactly, is your product? (See Figure 15-23.)
Figure 15-23. Detailing the product.
What is the name of your product, when will it be released, and which platforms are supported? There are rules about what constitutes an acceptable product name too. I could not have said just "mouse." I entered a description that included the manufacturer's model number and generic description ("rotary gaming mouse"). The question about platform support is different from the earlier question about operating systems in that it includes several varieties of each system. For example, you can specify that the product support Windows XP Home Edition but not Windows XP Professional, and so on.
Where are the driver packages for each operating system? In answering this question, you supply, for each operating system, the name of a directory tree that contains all the files that will be covered by the eventual signature file. That is, the directory tree includes the INF file or files and all files installed by those INF files. The easiest and best way to perform this step is to create a directory tree that mirrors your distribution media layout and contains all the files, and nothing but the files, that are destined for the end user's machine. (A hypothetical layout appears after these questions.) As part of this step, you also get to specify the languages supported by each driver package.
Where are the test logs for each operating system? You can't have more than one in the same directory because of filename conflicts, by the way.
For which hardware (PnP) IDs do you want Microsoft to distribute drivers via Windows Update? There are additional requirements for using Windows Update, by the way, but this step is one not to miss. See Figure 15-24.
Figure 15-24. Specifying PnP IDs for Windows Update.
How will the driver be distributed? There are many choices, all of which are contingent on you having the right to distribute the driver. See Figure 15-25.
Figure 15-25. Specifying driver distribution channels.
How do you intend to pay for the testing? Say what? You mean this isn't free? At the time I'm writing this, my test submission would have cost $250 for each of the operating systems (Windows XP, Windows 2000, and Windows Me) that I claimed to support, for a total of $750. The cost for retest submissions is the same, so you don't want to submit obviously flawed packages.
Where do you want the testing performed? There are several WHQL testing sites around the world. In the United States, you'd pick the one in Redmond. This question is actually relevant only if your submission requires hardware. Mine didn't. (See Figure 15-26.) In fact, most WHQL tests at the time I'm writing this are self-test programs and don't require you to submit hardware.
Figure 15-26. The packing list.
Does the party of the first part (hereinafter known as the party of the first part) agree that, and so on? Yes, there is a legal agreement that you have to sign.
Is this your final answer? That is, do you need to correct the driver and test log locations you specified earlier? Speak now, or forever hold your peace.
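Returning to the driver-package question: the Web application walks whatever directory tree you name in its search for files, so a small, self-contained tree per operating system is the goal. The layout below is purely hypothetical; the directory and file names are invented for illustration.

WHQLSubmission\
    WindowsXP\
        gamemou.inf
        gamemou.sys
        gamemou.cpl
    Windows2000\
        gamemou.inf
        gamemou.sys
        gamemou.cpl

Each per-platform directory holds the INF and every file that INF installs, and nothing else.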
Ta-da! (See Figure 15-27.) You're done. You can digitally sign your submission package and upload it to WHQL. This is where I had to stop. Not only did I have a submission package with fatal omissions in it, but also I didn't have (and didn't want to go to the considerable trouble of obtaining) a VeriSign ID. If I get much more stubborn and independent, I'll have to move to a cabin in Idaho and use the Internet with a dial-up modem the way our pioneer forebears did.
Figure 15-27. Ready to sign and submit.
I learned a few tricks in the process of running through the Web forms for the first time. As I mentioned, you want to be sure to have all the distribution packages and test results handy. You have plenty of time to finish the process, but the Web application will time out after about an hour, so don't plan on having a power lunch in between steps. Some of the choices you make can't be undone except by backing up. Choosing the wrong directories for certain options can add hours to the process if it forces the application to navigate large directory trees in its search for files. The forms warn you about the last two of these gotchas, so I don't think you're likely to go wrong.