The class is over, and the adrenaline is finally leaving your system. How did you do? Most trainers can "sense" a general feel from the learners' attention and participation, but what were they really thinking? Did they get it? Can they do their job better now than they could before the training started?
If you are a trainer, chances are you were hired for a specific job: making sure learners work better/faster/smarter. And as with any other job, your boss will want a full accounting of your performance in this area. How can you prove that you have accomplished your goal in a way that's measurable and easy to understand? You do this through evaluation.
There are a number of ways you can evaluate the success of your training, depending on how much time you have to prove your worth to the company. There are direct, timely methods, and there are indirect methods as well. Let's take a look at both and see which is best for you.
Direct Evaluation Methods
These are commonly called "Tests", "Assessments", and "Surveys". Basically, you check to see how well the learner performs at the beginning of the course, give quick tests in the middle of the course to see if they understand each of the modules you are presenting, and then have a final exam that tests overall comprehension. This is probably the most traditional method of evaluation, and everyone is pretty much familiar with it. But it only looks at a small snapshot of the learner's abilities. You don't know if the targeted skills are going to be applied on the job.
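If you want to put a rough number on that snapshot, comparing each learner's pre-test score against their final exam score is a simple start. Here's a minimal sketch in Python; the learners and scores are made up purely for illustration:

```python
# A minimal sketch of a pre/post score comparison; the data and field
# names here are hypothetical, not pulled from any particular LMS.
scores = [
    {"learner": "A", "pre_test": 55, "final_exam": 85},
    {"learner": "B", "pre_test": 70, "final_exam": 90},
    {"learner": "C", "pre_test": 60, "final_exam": 75},
]

# Report each learner's gain from the start of the course to the end.
for s in scores:
    gain = s["final_exam"] - s["pre_test"]
    print(f"{s['learner']}: {s['pre_test']}% -> {s['final_exam']}% (gain: {gain:+d} points)")

# A single class-wide number is handy when reporting up to your boss.
avg_gain = sum(s["final_exam"] - s["pre_test"] for s in scores) / len(scores)
print(f"Average gain across the class: {avg_gain:+.1f} points")
```

Even something this simple gives you a defensible "before vs. after" figure instead of a gut feeling.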
A real bonus of this method, particularly the survey, is that you can get a feel for your development and implementation of the course. How did it appeal to your learners? How are you doing as a presenter? There are a number of things you can learn through this method that will feed back into your ADDIE development, beyond just whether or not the analysis was correct.
Indirect Evaluation Methods
Indirect evaluation methods include monitoring employee performance over a long period of time, focusing on overall numbers and how they relate to the skill that needed to be taught. Is there an improvement? Did it warrant the devotion of resources?
For those who are familiar with any type of research, this should be nothing new. Researching the results of a change is part of what analysts do, and it's what makes them so valuable to companies (mostly because it's so boring no one else wants to do it ^_^). But what do you analyze? Focus on the results as compared to your initial needs analysis. Did the numbers you focused on for your initial analysis change? Did they change for the better? Were there other factors involved that were not initially recognized?
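As a concrete (and entirely hypothetical) example, suppose your needs analysis targeted average calls handled per hour. A quick before/after comparison against the original target might look like this:

```python
# A hypothetical before/after comparison of the metric your needs
# analysis targeted. All three numbers below are assumed for the example.
baseline = 42.0       # average from the months before training
post_training = 47.5  # average from the months after training
target = 50.0         # the goal set in your initial needs analysis

# How much did the metric move, and how much of the gap did we close?
change_pct = (post_training - baseline) / baseline * 100
progress_to_target = (post_training - baseline) / (target - baseline) * 100

print(f"Metric changed {change_pct:+.1f}% after training")
print(f"That covers {progress_to_target:.0f}% of the gap to the target")
```

Framing the result as "percent of the gap closed" ties your evaluation directly back to the needs analysis, which is exactly the comparison management cares about.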
For those trainers caught in the political arena of your company, who were forced to create a training program to compensate for non-skill-related issues, this is the perfect time to emphasize that while the skill became better known, the outcome did not improve because of the x and y factors. If you present the information in a scientific way, showing that even though the training was a success the solution failed to materialize, management will often concede, or let you go, which would also be an acceptable alternative. Who wants to be blamed for someone else's incompetence?
Seriously though, it's a good method for seeing how effective your training and your analysis were, and how well each of the learners assimilated the information. You learn how things are going and how you can improve your teaching style, and therefore increase your effectiveness as an instructor. A success here will validate your work and earn you a great promotion, a raise, and a chance to win a free 2-week vacation in the Bahamas! ^_^
When to Use Your Evaluation Style
Neither evaluation method is perfect on its own, so combining both is essential for a full view of how well you are doing. Use a quick assessment at the beginning of the course to find out where your learners are (if that is in question). Once you know, have them keep their scores for future comparisons and self-evaluation. Also have an after-class evaluation that is done anonymously, away from the classroom environment. That way the instructor's presence can't influence the outcome of the evaluation.
Then send two more evaluations, one after 3 weeks and one after 2 months. This way you can find out how well the content is remembered and what the learners' rate of recall is. This is good long-term data to be gathering. And finally, spend some time doing indirect evaluations by checking performance numbers. Of course, this assumes that you have access to the information. If you don't, you may want to provide a quick spreadsheet to the company that contracted your services so that they can provide the final data to you. They can leave out anything proprietary and still give you enough information to know whether you have been successful in your endeavors.
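That "quick spreadsheet" can be as simple as a CSV with a header row the company fills in. Here's one possible sketch; the column names are only suggestions, not any kind of standard, and the client can rename or drop anything they consider sensitive:

```python
import csv

# A sketch of a follow-up template to hand to the client company.
# Every column name here is a suggestion; adjust to what they can share.
columns = [
    "employee_id",            # or an anonymized code, if IDs are sensitive
    "completed_training",     # yes/no
    "metric_before",          # e.g., avg weekly output before training
    "metric_after_3_weeks",   # matches the 3-week follow-up evaluation
    "metric_after_2_months",  # matches the 2-month follow-up evaluation
    "notes",                  # other factors: new tools, staffing changes, etc.
]

# Write a one-row header file the company can open in any spreadsheet tool.
with open("training_followup_template.csv", "w", newline="") as f:
    csv.writer(f).writerow(columns)

print("Wrote training_followup_template.csv")
```

Lining the columns up with your 3-week and 2-month check-ins means the data comes back already matched to your evaluation schedule.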
So, that finishes this segment of the ADDIE program. I may post some additional information on the subject, but for now, I wish all of you good luck in your training development!