One of the most common ways of evaluating the effectiveness of a training course is to survey the participants. Surveys are the simplest and most time- and cost-effective ways of establishing whether the conditions were right for learning to take place. In this post, we’ll examine the concepts behind giving surveys and offer ten sample survey questions that could form part of a training effectiveness survey.
Why offer a survey?
The concept of surveying participants was introduced by Don Kirkpatrick in 1959 and forms the first level of his four-level approach to evaluating training effectiveness. Kirkpatrick’s system, known as the Kirkpatrick Training Evaluation Model, offers a comprehensive way of assessing the effectiveness of training.
Level 1 – known as ‘participant reaction’ – attempts to establish whether a training program created the right conditions for learning. Level 1 evaluations provide indicators of whether the participants think the right conditions for learning were created. They are sometimes used to indicate whether the participants feel that learning took place, and to what extent it might be useful for their work. However, actual learning is assessed by Level 2 evaluations.
Surveys are popular because they are easy to write, simple to deploy, and achieve high response and completion rates. Level 2 and Level 3 tests are also easy to write but require slightly more time, since Level 1 questions can be generic across more or less all training courses, while Level 2 and Level 3 questions must be tailored to the content of each one.
What can a Level 1 survey measure?
Surveys are a reliable and effective way of capturing participants’ reactions to things like:
The training course or program
The training methods
The course instructors or trainers
The assessment methods
The administration of the training
Participants’ responses to a survey can help create a picture of how effective the training was. The data can flag up certain areas that could have contributed to the success or failure of a training course and help identify ways in which future training could be improved.
What can a survey tell us?
A survey can offer valuable data about how participants responded to training. It can help identify things such as:
Which courses were popular and therefore likely to be well-attended?
Which trainers or course providers were well-liked?
Were there any barriers to learning?
Were there any clues as to how future training sessions could be improved?
Which learning needs did the training fail to meet?
What does this look like in practice?
Let’s imagine that a high number of survey respondents indicated that the contents of a particular training course were irrelevant or unsuitable. This feedback can be used to make adjustments and improvements when running the training in the future.
Similarly, if the venue or facilities received poor feedback, you could take measures to address these for future training sessions.
Are surveys worthwhile?
Some academics and researchers have claimed that because participant responses lack objectivity, they aren’t a reliable way of evaluating the effectiveness of training. This is a fair criticism, and it’s why participant surveys are just one part of a much larger process of training evaluation.
The Kirkpatrick model, for instance, has four levels:
Level 1: Reaction
Level 2: Learning
Level 3: Behavior
Level 4: Results
When you survey participants for a level 1 evaluation, you need to view these results alongside level 2 (learning) data that examines what they actually learned.
If the level 2 evaluations indicate that learning didn’t take place on a particular training course, the level 1 survey would flag up which aspects of the course scored poorly among participants. This would help you understand the specific areas that should be improved.
A more specific criticism of participant surveys was leveled by researcher Michael Gessler in 2009. Working for the University of Bremen’s Institute of Technology and Education, Gessler applied the Kirkpatrick training evaluation model to 43 training courses and found “no correlation between the reaction (level 1) and the learning (level 2)” stages.
There are a whole host of reasons why participants may react positively or negatively to any given training session. Gessler’s empirical research showed that these factors are largely unrelated to whether or not learning takes place.
For example, an extremely personable and lively instructor may capture participants’ attention and garner favorable survey responses. Yet the content that they deliver may fail to produce the intended learning outcomes.
According to Gessler, “the practice of evaluating professional training based on participant satisfaction requires further development.” This research is extremely important for anyone who plans to write survey questions. It emphasizes the need to ask relevant questions that probe the right areas.
How can I create effective survey questions?
For a training effectiveness survey to be successful, it must contain relevant, well-constructed questions. These questions should take into account three specific factors.
1. What were the stakeholders’ expectations of the training?
The stakeholders in most companies will be the management who decided to implement the training. What expectations did they have?
2. What were the goals of the training?
Every training course must be designed to meet specific goals. The questions you ask in a level 1 evaluation survey should reflect those goals.
3. What are the goals of the evaluation?
To create a successful survey, you must consider the goals of the evaluation. What are you trying to evaluate and why?
While pre-authored example questions can be helpful, you may also need to write your own questions that meet the specific needs of your company or organization.
Effective survey questions should be:
linked to business objectives or stakeholders’ expectations
linked to the training objectives
balanced in number to ensure quality without hurting response rates
free from bias (more about that in our whitepaper for determining the impact of training)
able to accommodate all possible answers (multiple choice or open-ended responses)
When writing questions, don’t assume that respondents will know how to answer them. To be valid, each question should be clear enough that learners understand exactly how to respond. You should also be sensitive to ethical and moral issues.
10 training effectiveness survey questions to ask
The following ten sample questions should provide a helpful example of the types of questions you may wish to include in a participant response survey.
Question #1. Did the training content meet your expectations?
This question can be answered with a simple ‘Yes’ or ‘No’ check mark. This is a great question to ask as it helps you identify whether the training content matched the participants’ expectations. If you have a course where many respondents indicated that the training failed to meet their expectations, this could indicate a problem with the course content.
If you wish, you could include an optional open-ended question such as ‘Why or why not?’ and provide space for a written answer.
Question #2. Was the size of your training group appropriate?
This question helps illuminate whether the learners felt comfortable in their groups. If the group size was too large, participants may feel as though their needs weren’t met. As with the first question, you may include space for a written response. This can help you tailor future training sessions and find the optimal number of trainees per course or per session.
Question #3. How would you rate the quality of the training?
A 1 to 5 option (1, 2, 3, 4, 5) with 1 = unacceptable and 5 = outstanding should give you a good idea of how the learners viewed the instruction overall. If a course received many low ratings, you could reasonably assume that the course provider or the content didn’t meet the needs of the learners.
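To see how responses to a 1-to-5 question might be turned into an actionable signal, here is a minimal sketch. The response data, the flagging threshold, and the function name are all hypothetical, purely for illustration:

```python
def summarize_ratings(ratings, low_threshold=2, flag_share=0.3):
    """Return the average rating and whether the course should be flagged
    because a large share of responses were low (<= low_threshold)."""
    average = sum(ratings) / len(ratings)
    low_share = sum(1 for r in ratings if r <= low_threshold) / len(ratings)
    return round(average, 2), low_share >= flag_share

# Hypothetical example: ten participants rate training quality on a 1-5 scale.
responses = [5, 4, 2, 1, 3, 2, 1, 4, 2, 1]
avg, flagged = summarize_ratings(responses)
print(avg, flagged)  # prints: 2.5 True
```

In this sketch a course is flagged for review when at least 30% of respondents give a rating of 2 or below; in practice you would choose a threshold that fits your organization’s standards.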
Question #4. Was the mix of presentations and activities suitable?
Most training courses feature a mixture of instructor-led presentation sessions and activities where the trainees work individually or in groups on certain tasks. A presentation-heavy training course may leave attendees feeling as though they lacked time to put what they learned into practice.
This question can be answered with a 1 to 5 multiple-choice option.
Question #5. How would you rate the quality of the instructor?
A 1 to 5 ranking system (1 = unacceptable; 5 = outstanding) would help you identify how the learners felt about the course instructor. Many low ratings may indicate that the instructor wasn’t well suited to delivering the course. Many high ratings would indicate that the learners felt comfortable with the quality of the instructor.
If you wish, you could create other questions that delve deeper into the performance of the instructor.
These may include:
What was the instructor’s level of content knowledge?
How was the speed of delivery?
How would you rate their organization and preparation?
How was their enthusiasm?
You may also want to leave space for handwritten or typed responses with the instruction: ‘Please provide any additional feedback for the instructor.’ This gives participants an opportunity to give praise or offer criticism in ways that multiple-choice responses cannot accommodate.
Question #6. Did you learn anything new?
Respondents can answer this with a ‘Yes’ or ‘No’ option. You can also invite written responses by asking learners, ‘If yes, please provide details’. This will give valuable data as to the areas that the trainees felt offered the most value.
Question #7. Was the training relevant to your needs?
To dig down into the details, you need to understand whether the learners felt the course was a valuable use of their time. This question could invite responses in a multiple-choice format, for example, 0 = irrelevant and 5 = highly relevant. This data gives you a clear idea of whether the participants found the course useful and helpful.
Question #8. Was the course practical and/or easy to apply?
The results from a training effectiveness survey should be used in conjunction with other training evaluations. In later stages, you’ll be assessing whether learning took place and to what extent the training made its way into the workplace. This question helps you understand how the trainees felt about the course. If later assessments found little evidence that participants were putting the training into practice at work, the answers from this question may offer clues as to why that was the case.
Question #9. Would participants recommend the training to colleagues?
A ‘Yes’ or ‘No’ response option is most suitable for this question. High numbers of participants indicating that they wouldn’t recommend a course is a sign that the training failed to live up to expectations or was poorly planned and implemented.
Question #10. Do you have any suggestions to improve this course?
This question is best asked as an open-ended written response. This type of response takes longer to read and interpret but can highlight areas that other questions missed. From the participants’ perspective, it’s important that they feel able to express their opinions about a training course in an open and unconstrained manner.
How to improve the user-friendliness of your survey
While the particular questions you ask will depend on the stakeholders’ expectations, the goals of the training and the goals of your evaluation, there are certain techniques you can use to improve the quality and usefulness of your survey.
Balance the number of questions
Keep in mind that the fewer questions you ask, the higher your completion rate is likely to be. However, more questions yield richer, higher-quality data for reporting later on. Make sure to find the right balance.
Keep the questions short
Keep the questions brief and clear. Avoid using abbreviated words or jargon.
Put the questions in a logical order
Structure your questions so that those requiring a simple ‘Yes’ or ‘No’ response come first and more open-ended responses come later.
Keep each question separate
Avoid ‘branching’ questions where one question is dependent on the response to the previous question. Branching will lead to confusion and lower the response and/or completion rate of your survey. Separate each question to make it as clear as possible.
Conduct a pilot test
Before deploying your survey, it’s a good idea to test it on a small sample group. Share the questions with stakeholders to ensure they are linked to their expectations. Conduct a pilot test with people from your target audience. You may include an additional open-ended question such as ‘Please let us know of any difficulties or complications in completing this survey’. This can give you valuable feedback about which questions need rewording or revising.
Over to you
Hopefully, these ten training effectiveness survey questions have given you a good starting point for writing your own questions. For more Level 1 evaluation questions, download our free form for evaluating training effectiveness.