Training evaluation should include
The effort put into the design of any evaluation will pay rich dividends, but defining the right questions is always the key starting point. There are degrees of correctness in how those questions are defined, but whatever you ask should always be measurable and achievable within the time and cost frame you actually have (adapted from Crompton). The further down you go in the evaluation process, the more valid the evaluation.
Even though reaching level 4 is the most desired result of an evaluation process, it is usually the most difficult to accomplish. Evaluating effectiveness often involves the use of key performance measures. Be selective! Do not hand the learner a huge list of questions; work out what you really want to know and the best way of finding it out. Be realistic! Form-filling is never fun, so do not expect people to conscientiously work their way through a long and complex evaluation form.
Be creative! Why not evaluate with an activity that is itself engaging and enjoyable? Create evaluative processes that engage participants and at the same time provide you with valid feedback.
Be balanced! You may develop a standardised evaluation process in order to monitor results over time. However, by asking the same questions, you are always looking at courses from the same perspective. Be holistic!
After a course in which people have gained a whole range of experiences, it is not realistic to expect anyone to express their true evaluation of it on a piece of paper. Paper exercises can be very useful, but they should be seen as part of a much wider evaluation process that includes dimensions of learning that are less easy to capture on paper.
Training should always incorporate an evaluation process in order to analyse which elements have successfully achieved their objectives and which have failed their purpose. When time and costs are restricted, the process may cover only the first level of evaluation; conducting a comprehensive evaluation means including all four levels.
Regarding the applicability of the evaluation, you should consider which techniques and methods are most appropriate for the intended purpose. It is important to keep in mind the advantages and challenges of the chosen tools before applying them in the evaluation process. This guide aims to give an overview of the principles of evaluation while mentioning the advantages and disadvantages of specific methods. It is a practical guide for lecturers interested in evaluating materials for their effectiveness in achieving specific learning objectives.
This paper provides the reader with a short overview of the four levels of evaluation developed by Kirkpatrick and highlights the importance of using different approaches for different stakeholders in the evaluation process. The fifth level, added by the Phillips V-model, measures the return on investment of the training. Although there is no direct way to attribute business performance to training, the Phillips V-model uses specific metrics and measures to come as close to that viewpoint as possible.
For example, a cost-benefit analysis of a training program can determine whether the money invested in training has had any impact. Are there measurable results? If so, the benefits of the training are clear. Of course, it is important to take into account other considerations specific to your organization and program.
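As a rough illustration of such a cost-benefit check, the sketch below computes the benefit-cost ratio and the ROI percentage used in the Phillips methodology; the cost and benefit figures are hypothetical placeholders, not data from any real program.

```python
# Minimal sketch of a cost-benefit / ROI check for a training program.
# All figures are hypothetical placeholders.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """BCR = program benefits / program costs."""
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    """Phillips-style ROI (%) = net program benefits / program costs * 100."""
    return (benefits - costs) / costs * 100

if __name__ == "__main__":
    program_costs = 40_000      # design, delivery, participant time (assumed)
    program_benefits = 60_000   # monetised productivity gains (assumed)

    print(f"BCR: {benefit_cost_ratio(program_benefits, program_costs):.2f}")  # 1.50
    print(f"ROI: {roi_percent(program_benefits, program_costs):.0f}%")        # 50%
```

In this hypothetical case the training returns 1.50 for every unit of currency spent, i.e. an ROI of 50%; a negative ROI would suggest the program costs more than the benefits it produces.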
Generally, not all of your training efforts will be evaluated at all five levels of the Phillips V-model. That would be a waste of resources. As an example, you would evaluate a two-year executive leadership training program at all five levels.
This is because it is high-impact, highly complex, and of high value. However, you may not go that far for, say, a training program that only teaches three employees a new way of replying to customer emails. Phillips recommends evaluating all learning projects at the first level and then assessing, based on need and importance, up to which level each training program should be evaluated. For example, you may determine that you need to evaluate learning only up to level 3.
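One way such a decision could be encoded is sketched below; the rating attributes, scoring rule, and thresholds are illustrative assumptions and are not prescribed by the Phillips methodology itself.

```python
# Illustrative helper for deciding how far up the evaluation levels to go.
# Every program is evaluated at level 1; higher levels are added only when
# the program's impact, complexity and value justify the effort.
# The 1-5 ratings, scoring rule and thresholds are assumptions for illustration.

def recommended_max_level(impact: int, complexity: int, value: int) -> int:
    """impact, complexity and value are rough 1-5 ratings of the program."""
    score = impact + complexity + value          # ranges from 3 to 15
    if score >= 13:
        return 5   # full evaluation including ROI (e.g. executive programs)
    if score >= 10:
        return 4
    if score >= 7:
        return 3
    if score >= 5:
        return 2
    return 1       # reaction only (e.g. a small, low-stakes workshop)

print(recommended_max_level(impact=5, complexity=5, value=5))  # -> 5
print(recommended_max_level(impact=2, complexity=1, value=2))  # -> 2
```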
As a reminder of the evaluation targets of the Phillips V-model, look at the diagram below. Again, your determination of which level to measure should be based on budget and business need. The levels of evaluation happen at different times, as demonstrated in the following table.
The examples of tangible and intangible outcomes of learning would look something like this (Phillips). Once you have chosen the level at which to evaluate your training program, you can determine which training evaluation tool to use.
You may use one of these tools or a combination of them, depending on the level of evaluation and on your budget and need. You then need to combine and analyze the data from the different sources, which will help you come to meaningful conclusions and determine the effects of a training program. To give the data meaning, you need to represent it in a meaningful way, for example by developing a report that includes charts and comments.
It could also mean compiling video analyses and texts. Ensure the report is checked for accuracy and produced without any preconceived objective in mind. Think about how you will visualize the data, because it will be needed for decision-making. You can then report the results to key stakeholders (management as well as employees). This is where level 5 of the Phillips V-model becomes so important: it shows the ROI on the training that was invested in.
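A minimal sketch of how data from different sources might be combined and visualized is shown below, assuming Python with pandas and matplotlib; the file names and column names are hypothetical.

```python
# Sketch of combining evaluation data from different sources into one
# summary chart for stakeholders. File and column names are assumptions.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical inputs: learner survey scores and manager follow-up ratings,
# each with columns "topic" and "score" (1-5).
surveys = pd.read_csv("learner_survey.csv")
follow_up = pd.read_csv("manager_follow_up.csv")

surveys["source"] = "Learner survey"
follow_up["source"] = "Manager follow-up"
combined = pd.concat([surveys, follow_up], ignore_index=True)

# Average score per topic and source, shown as a grouped bar chart.
summary = combined.pivot_table(index="topic", columns="source",
                               values="score", aggfunc="mean")
summary.plot(kind="bar", ylim=(0, 5), title="Training evaluation summary")
plt.ylabel("Average score (1-5)")
plt.tight_layout()
plt.savefig("evaluation_summary.png")
```

The resulting chart can go straight into the report for stakeholders, with comments added alongside it.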
This section in your learner satisfaction survey asks questions about the way the course content was delivered. This element also includes the conciseness and clarity of content. Was the subject matter easy for the learners to follow and understand? Was the wording of any written materials clear?
Were the multimedia materials clearly visible and audible? For face-to-face or synchronous training, this part of the survey also covers the manner in which the lesson was facilitated. Was the trainer knowledgeable in the subject matter? Was the facilitator able to explain the topic in a way the audience could relate to? Did the instructor offer enough support to the learners during the session? These are some of the questions you can include in your post-training evaluation questionnaire.
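As a small illustration, the sketch below structures the delivery-related questions above as a rated questionnaire and computes an average score per question; the 1-5 agreement scale, the data structure, and the sample responses are assumptions for illustration only.

```python
# Sketch of a post-training satisfaction questionnaire on content delivery,
# scored on an assumed 1-5 agreement scale.

delivery_questions = [
    "The subject matter was easy to follow and understand.",
    "The wording of the written materials was clear.",
    "The multimedia materials were clearly visible and audible.",
    "The trainer was knowledgeable in the subject matter.",
    "The facilitator explained the topic in a way I could relate to.",
    "The instructor offered enough support during the session.",
]

def average_ratings(responses: list[list[int]]) -> list[float]:
    """responses[i][q] is learner i's 1-5 rating of question q."""
    n = len(responses)
    return [sum(r[q] for r in responses) / n
            for q in range(len(delivery_questions))]

# Two hypothetical respondents rating the six questions above.
sample = [[5, 4, 4, 5, 4, 5],
          [4, 4, 3, 5, 5, 4]]
print([round(x, 1) for x in average_ratings(sample)])
# -> [4.5, 4.0, 3.5, 5.0, 4.5, 4.5]
```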
Any type of course-related material, such as participant handouts, presentation slides, or multimedia, falls under this aspect. This is a critical component, particularly for eLearning courses, because it tells you whether your interface, learning environments, and multimedia resources are aesthetically appealing. Having well-crafted multimedia resources therefore matters, especially for self-paced, asynchronous courses. When you solicit feedback, you can ask your learners to rate the multimedia components (video, audio, and images) of the program based on how attractive they find them.
No one wants to just be a passenger in their own learning; learner engagement must be a priority for any type of course.