As you proceed, remind yourself that problems are to be expected and acceptable. Even if something is committed to paper, and even if you've pitched an idea vehemently in your business case, be prepared to change or scrap it if it doesn't work. Don't waste effort on initiatives that won't work; you will lose credibility throughout the company and create hostility where you once found allegiance.
But beyond scrapping those things that fail, you also need to constantly assess and modify your processes and plan. Elements of the project that are working will still require tweaking; certain employee groups will need more attention than others to get them involved in the project; and certain elements of the plan will need to be changed or tossed aside if they aren't working. This is a time to be flexible, to be introspective, and to be honest about your successes and failures. It's a time to constantly evaluate your choices, and to pinpoint what's working and what's not.
There was an instance where a Rockwell Collins manager said that we were making an inaccurate assessment of the viability of e-learning and that the entire strategy was doomed to failure. She went on to suggest that if we wanted to look good, we should choose a different approach. Looking good was more important to some in the culture than doing the right thing. We chose to do the right thing.
Your learning-management system will help you assess course usage and employees' success rates. For some companies, usage alone or time spent in each course will be an acceptable reflection of a course's value. You may want to track modules completed, time per page, or test scores.
Some companies track only completion rates; however, we believe that with Web-based training, the focus should be primarily on learning acquired, not final tests completed. The idea is to deliver knowledge in a fast, efficient package, not to force users to take modules they don't need in order to provide you with trackable test scores.
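To make the metrics above concrete, here is a minimal sketch (not Rockwell Collins' actual system) of rolling raw LMS usage records up into per-course totals. The record fields and course names are illustrative assumptions, not anything from the book:

```python
from collections import defaultdict

# Hypothetical LMS export: (course_id, employee_id, minutes_spent, modules_completed)
records = [
    ("negotiation-101", "e1", 42, 3),
    ("negotiation-101", "e2", 15, 1),
    ("six-sigma-intro", "e1", 60, 5),
]

def summarize_usage(records):
    """Aggregate raw records into per-course learner, time, and module totals."""
    summary = defaultdict(lambda: {"learners": 0, "minutes": 0, "modules": 0})
    for course, _employee, minutes, modules in records:
        summary[course]["learners"] += 1
        summary[course]["minutes"] += minutes
        summary[course]["modules"] += modules
    return dict(summary)
```

A real LMS would expose these rollups as built-in reports; the point is simply that usage, time, and module counts can be read off the same raw records, so you can choose whichever reflection of value fits your culture.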
We reject the notion that people must complete online courses before they count. The popular trade media are full of articles bemoaning the 25 percent completion rate that most learning-management systems record. The truth of the matter is that we think that's about right.
Research clearly shows that adults learn differently from the way children do. One of the key differences is that adults simply acquire what they need and utilize that knowledge. It is completely logical to assume that people go online, learn what they need, and then exit. They don't spend time going over material they already know. Mandated courses and forced completions serve only to frustrate adult learners and turn them off to the education process.
Don't let your success be measured by completion rates. The real measure of success is whether the employee base has the skills and knowledge to do their jobs properly. If you let senior management hold you accountable for an arbitrary completion rate, you are allowing managers and supervisors to avoid their responsibility: assessing their direct reports' performance; providing clear, unambiguous feedback; and correcting any performance deficiencies.
The commitment to completion rates is a cultural issue that you may or may not choose to do battle over. Regardless, usage rates, whether they monitor time spent or courses completed, will provide you with valuable information regarding two elements of your project. They will allow you to monitor which groups of employees are taking advantage of training and which ones aren't. You can use that information to plan your marketing campaign and to schedule user presentations.
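The group-level monitoring described above can be sketched in a few lines. This is a hedged illustration, not the authors' tooling; the group names, access counts, and threshold are invented for the example:

```python
def groups_needing_attention(accesses_by_group, threshold):
    """Return groups whose course-access count falls below the threshold."""
    return sorted(group for group, count in accesses_by_group.items()
                  if count < threshold)

# Invented quarterly access counts per functional unit.
accesses = {"engineering": 480, "finance": 35, "operations": 210, "hr": 12}

# Any group under the threshold is a candidate for a repeat presentation
# and a fresh round of marketing.
lagging = groups_needing_attention(accesses, threshold=50)
```

The output here would flag the finance and HR groups, which is exactly the signal you would use to schedule user presentations for those units.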
Whenever our data shows a drop in use within a group or functional unit, we deliver our Project Oasis presentation again to reintroduce the curriculum and our enterprise rationale. We tell executives why this system is important to the future of the company, how it ties to the business vision of the company, what needs it meets, and how to use the system. Sometimes we need to reintroduce groups to the training because their needs have changed so much in two years. Sometimes they have a large crop of new employees or Rockwell Collins acquires a company and the new population is unfamiliar with the system. Even years after our initial launch, a constant process of monitoring and reacting is required to maintain a balance of users throughout the organization. Any project will die on the vine if the marketing stops.
We also use usage rates to prove our success with new courses and the system as a whole. That can come in very handy when you are inundated with complaints from a small minority who can't use the system. And it's powerful data with which to verify that you hit or exceeded your targets. For example, in Year Three (2002) our team set a goal of having 21,000 courses accessed. At the time we set the goal, we thought it was a stretch and that it would take a lot of work and marketing, but we wanted to push ourselves to achieve it. By June of that year we already had 36,000 users and expected to have double that by the end of the year. That's a great indicator to us and our supporters that the program is valuable and useful to the employees.
If you aren't succeeding, your usage rates will tell you that as well. They show you if particular courses are underutilized, which may mean there isn't a need for that topic, or that people find the course confusing or of little use. A learning-management system report showing a lack of use across employee groups is the first sign that something is wrong with a course or set of courses. From there you can survey end users, evaluate content, and reassess the process you used to choose the content in the first place.
Every quarter, the learning and development team gets rid of the thirty least-used courses and replaces them with thirty new ones. This keeps content fresh, it gives us new courses to announce and market, it keeps users curious about what's available, and it ensures that we continue to offer the highest-quality courses available.