Researching Technology

We wanted to convert Rockwell Collins's classroom-based approach to a largely technology-based model, with goals to convert 30 percent of courses to alternative forms in the first year, 50 percent in the second year, and 70 percent in the third year. This would allow us to achieve our goal of providing 400 percent more learning, with 24/7 global accessibility, at 40 percent less cost to the company.

With those goals set, we explored the hardware and software required to update the existing system so that it could deliver robust technology-based training. We evaluated the existing infrastructure with the help of Steve Junion, our training-technology expert, and made a plan to update technology where gaps existed. For instance, in our research we discovered that half of the end-user computers at Rockwell Collins didn't have sound cards, not because they didn't come with sound cards but because management had them removed. They didn't want certain employee groups to have access to computer-based sound because it would make it easier for them to waste time playing games, another telling element of the Rockwell Collins culture.

Along with installing sound cards in every computer, we planned to buy hundreds of headsets, complete with microphones. The employees would need the headsets if they were going to take online courses at work, and they would need the microphones if they were going to participate in any of the virtual-classroom technology that we knew we'd be acquiring.

Once we had dealt with the infrastructure, we were ready to examine the technology-based training tools available on the market. The existing learning and development staff had spent the previous months researching the latest technology trends in the training industry. So, by the time we reached this phase of the process, our team was not only educated on the available vendors and tools but had also compiled materials about the most effective hardware and software on the market.

Using the information and knowledge they collected, we made a list of the types of tools that our team would need to deliver and maintain computer-based training, including virtual-classroom software, self-paced e-learning courses, tracking tools, learning-management software, authoring tools, and Webcasting materials.



Evaluating Learning Management Systems: A Three-Phase Process

As a team, we didn't just go out and watch product demos; we developed a multi-tiered, criteria-based process for reviewing and testing tools. The team did front-end needs analyses to determine the features and tasks expected from each tool. Then the team established sets of features for each type of technology by brainstorming with the training and development team, our technology expert, and representatives of the business units. No tool would be purchased without a system of qualifications. Each vendor and product would be compared with the others and against a mapped set of requirements. Everything in our vendor-selection process was criteria based.

You can't choose the right vendor or product until you know what you need and what's important to you. Establishing criteria gives you a quantifiable list of functions against which you can easily rate the value of a product's features.

Phase One: Set Your Criteria

To simplify the process and ensure we made the right hardware and software choices the first time around, the team created feature checklists for each tool. As the first step in the tool-selection process, team members were given a list of the generic capabilities each type of tool offered and asked to rate each one as "required," "optional," or "not required." This process made it easier for team members to understand what the tools were capable of and to rank the value of each feature. It also supplied us with organized, quantifiable data.

For example, in the first phase of the needs analysis for a learning-management system, seven Rockwell Collins managers, including people from the learning and development staff and units across the company, reviewed a common set of tool characteristics and functions to define and prioritize their own business requirements. They compiled a list of 168 tool functions across nine categories, including items such as tracking computer-based training, multimedia station set-up, HTML report generation, and cost tracking.
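
To make that concrete, here is a minimal sketch, in Python, of how such a checklist might be captured and tallied. The function names, reviewer ratings, and the majority-vote rule are illustrative assumptions, not Rockwell Collins data.

# A minimal sketch of a feature checklist: each reviewer rates every tool
# function as "required," "optional," or "not required," and functions that
# most reviewers mark "required" become the required-feature list.
# All names and ratings below are invented for illustration.
from collections import Counter

checklist = {
    "Employee access": ["online evaluation", "password change", "supervisor access"],
    "Reporting": ["attendance history", "HTML report generation", "cost tracking"],
}

# reviewer -> {function: rating}
reviews = {
    "reviewer_1": {"online evaluation": "required", "cost tracking": "optional"},
    "reviewer_2": {"online evaluation": "required", "cost tracking": "required"},
    "reviewer_3": {"online evaluation": "required", "cost tracking": "required"},
}

def consensus(function_name):
    """Return the most common rating a function received across reviewers."""
    votes = Counter(
        ratings[function_name]
        for ratings in reviews.values()
        if function_name in ratings
    )
    return votes.most_common(1)[0][0] if votes else "not required"

required_features = [
    fn
    for functions in checklist.values()
    for fn in functions
    if consensus(fn) == "required"
]
print(required_features)  # ['online evaluation', 'cost tracking']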

Phase Two: Prioritize

We asked our team to evaluate the tool functions and prioritize the overall categories according to importance. For the learning-management system, the categories were rated in the following order:

  1. Employee access, which includes functions such as online evaluation, password change, and supervisor access

  2. Self-paced instruction, including electronic-signature capabilities, CBT scheduling, and time tracking

  3. Reporting capabilities, including attendance history, report generation, and reporting of quality certification

  4. Library and materials management, including material-ordering options, waiting list, and check-out features

  5. Skills and performance management, including certification options, gap analysis, and individual development plans

  6. Equipment, including inventory management

  7. Classroom-instruction tools, including walk-up attendance options

  8. Facilities, including classrooms and computer-room management

  9. Miscellaneous, including event notification and certificate generation

This process was critical because our intention was not to choose the tool that offered the most features but to determine which tool could deliver the functionality that was important to us for the best price.

Most companies don't use half of the features they pay for in these tools. We had no intention of paying for things the company didn't need. Extra features just make a complicated tool even more difficult to use. Our goal was to build a simple, cost-effective system that served our needs without a lot of extras.
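
One way to act on that priority order is to weight each category by its rank and compare a tool's weighted coverage of required functions against its price. The weighting scheme, coverage figures, and prices below are hypothetical, shown only to illustrate the arithmetic; they are not the values Rockwell Collins used.

# Hypothetical sketch: weight the nine categories by priority rank
# (rank 1 carries the most weight), then compare each tool's weighted
# feature coverage against its price. All numbers are illustrative.

category_ranks = {
    "employee access": 1, "self-paced instruction": 2, "reporting": 3,
    "library and materials": 4, "skills and performance": 5, "equipment": 6,
    "classroom instruction": 7, "facilities": 8, "miscellaneous": 9,
}

# Simple rank-reciprocal weights; any monotonically decreasing scheme would do.
weights = {cat: 1.0 / rank for cat, rank in category_ranks.items()}
total_weight = sum(weights.values())

def weighted_coverage(coverage_by_category):
    """coverage_by_category maps category -> fraction (0..1) of required functions offered."""
    return sum(
        weights[cat] * coverage_by_category.get(cat, 0.0) for cat in weights
    ) / total_weight

# Two hypothetical tools: one feature-rich and expensive, one cheaper but thinner.
tools = {
    "tool_a": {"coverage": {"employee access": 0.95, "reporting": 0.90}, "price": 250_000},
    "tool_b": {"coverage": {"employee access": 0.60, "reporting": 0.40}, "price": 90_000},
}

for name, tool in tools.items():
    score = weighted_coverage(tool["coverage"])
    print(name, f"weighted coverage {score:.2f}, per $100k: {score / tool['price'] * 100_000:.2f}")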

Phase Three: Rank Your Vendors

Using the information we'd obtained, we sent an initial checklist of the 168 tool functions to five vendors, asking them to indicate whether they offered each feature now or planned to offer it in the near future. Each answer received a numeric rating, which allowed us to rank the five vendors based on the percentage of our required tool functions they offered. Only features a vendor already offered received a positive rating; planned features did not count.
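
A sketch of that scoring follows, with invented vendor names, functions, and responses: a feature counts toward a vendor's percentage only if it is offered today, while "planned" answers score nothing.

# Hypothetical sketch of the vendor scoring: each vendor answers "yes,"
# "planned," or "no" for every required function, and only "yes" earns a
# point. Vendor names, functions, and responses are invented.

required_functions = ["tracking computer-based training", "HTML report generation", "cost tracking"]

vendor_responses = {
    "vendor_a": {"tracking computer-based training": "yes", "HTML report generation": "yes", "cost tracking": "yes"},
    "vendor_b": {"tracking computer-based training": "yes", "HTML report generation": "planned", "cost tracking": "no"},
}

def coverage(responses):
    """Percentage of required functions the vendor offers today."""
    offered = sum(1 for fn in required_functions if responses.get(fn) == "yes")
    return 100.0 * offered / len(required_functions)

ranking = sorted(vendor_responses, key=lambda v: coverage(vendor_responses[v]), reverse=True)
for vendor in ranking:
    print(vendor, f"{coverage(vendor_responses[vendor]):.0f}% of required functions")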

Our team found that the top two vendors offered 90 percent or more of the features on our list, but they were also the most expensive; the lowest-ranking vendor offered only 33 percent of the features but was the most affordable.

This simple front-end evaluation process made it possible for us to zero in on the vendors who were most likely to deliver what the company needed within our budget, and we did it in a relatively short amount of time. Without this ranking, we would have approached every vendor on equal ground. We would have wasted valuable time reviewing products that wouldn't meet our needs or that fell outside of our budget constraints, or, worse, we might have bought one of them without being aware of its shortcomings.