Once we had established what the vendors could provide, our team was able to further narrow the search through criteria comparisons. We did side-by-side comparisons of products to see which ones offered the most valuable features. For example, our criteria for virtual-classroom software included forty-three features, such as hand raising, Yes/No responses, step-out-and-return options, instructor and student "do not disturb" buttons, text chat, and whiteboard.
We used these criteria to create requests for proposals from virtual-conference software vendors, then rated the products in side-by-side comparisons. When one product's capabilities outweighed the other's, that product received a point. When both products could do the task equally well, they received an "equal" rating. The product with the most points won that round.
Figure 5-1 shows a segment of the criteria sheet comparing LearnLinc and Centra virtual-classroom software features. Along with end-user features shown in the figure, the criteria covered issues such as architecture, bandwidth, support offerings, licensing, and standards compliance. When all forty-six categories were totaled, Centra was favored twenty-three to three. The rest of the categories were rated "equal."
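The scoring rule described above is simple enough to sketch in code. The following Python snippet is only an illustration of the mechanics, with made-up feature names and ratings; the actual criteria sheet covered forty-six categories.

```python
# Illustrative sketch of the side-by-side scoring rule: one point to the
# winning product per criterion, no points when both are rated "equal".
# Feature names and ratings below are hypothetical examples.
from collections import Counter

ratings = {
    "hand raising": "B",            # product B's capability outweighed A's
    "yes/no responses": "equal",    # both products handled this equally well
    "text chat": "equal",
    "whiteboard": "B",
    "do-not-disturb buttons": "A",  # product A's capability outweighed B's
}

def tally(ratings):
    """Count points for each product; 'equal' ratings score no points."""
    counts = Counter(ratings.values())
    return counts["A"], counts["B"], counts["equal"]

a_points, b_points, equal = tally(ratings)
print(f"Product A: {a_points}, Product B: {b_points}, equal: {equal}")
```

In this toy example, product B would win the round two points to one, with two categories rated equal.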
Using a point system takes the emotion out of the process of choosing a vendor. It allowed us to quickly research and organize the top products in the market so we knew what we wanted and who could give it to us. By creating criteria for every piece of technology, our team was able to narrow the field and communicate to vendors exactly what we wanted and how we expected it to be delivered, which vastly reduced our chance of failure.
Before our team made any decisions about purchasing tools, we wanted outside opinions of the products being considered. It's our experience that vendors vastly oversell their products, which is why this phase of the review process is important. You cannot just assume that what the vendor says is accurate. You need proof.
We contacted customers of the tool vendors to ask how satisfied they were with the tool, the service, and the support. In some cases, vendors could not provide us with any satisfied customers to contact, which of course made us extremely skeptical. If your vendor doesn't have a success story to share, assume that yours won't be the first. And even among those who did provide contacts, we were still somewhat suspicious. Vendors frequently offer deals to customers in exchange for their willingness to gush about a product's capabilities, so any offered contacts were met with hesitation.
To build more honesty and integrity into this review process, we turned to our own business networks to find peers using these tools who had not been recommended by the vendors. With this back-door approach, we found as many dissatisfied customers as satisfied ones, although admittedly not every product failure was the fault of the vendor. Many users had no business rationale, beyond a desire to do e-learning, for purchasing the tools they were using.
To weed out those without a relevant business case for the purchase, we asked one simple question: "What strategic business goal lay behind your choosing this technology?" If they couldn't answer the question, then it was likely they had no idea what they were doing, and any tool failure was probably the result of their own inefficient processes rather than the product.