Training Feedback Survey Questions: Write Good Questions with These Tips

2019.06.29 Jonathan Deller

You’ve wrapped up your training session and want to find out what people thought.

Now all you need to do is write some training feedback survey questions.

Seems simple enough, doesn’t it?

Think again. Writing good survey questions isn’t as easy as you may think. Even the simplest of questions can be misread, misinterpreted, or simply left unanswered.

In this post, we’ll offer a range of tips for writing good training feedback survey questions. Following our pointers will help you craft strong questions that make your survey as successful as possible and get you the data you need to improve the quality of your training.

Tip 1. Create a learning impact map

To write effective training feedback survey questions, the first thing you should do is get your learning impact map in place.

Step 1: Identify the objectives of your training and the desired results.

Step 2: Outline the employee behaviors that will help deliver these results.

Step 3: Identify the knowledge, attitudes, and intentions that will drive this behavior.

Step 4: Define the measurable objectives of the program.

Creating a learning impact map will force you to keep the end result of your training in mind. This prevents you from writing irrelevant questions. This, in turn, should boost the response rate of your survey too.
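If you manage surveys programmatically, the four steps above can be sketched as a simple data structure. Everything below is illustrative (the field names and the `is_relevant` helper are assumptions, not a standard schema):

```python
# A minimal, illustrative sketch of a learning impact map.
# All field names are assumptions, not a standard schema.
impact_map = {
    "desired_results": ["Reduce average call-handling time"],
    "behaviors": ["Use the standard greeting and triage script"],
    "drivers": {  # knowledge, attitudes, and intentions behind the behavior
        "knowledge": ["Know the triage script steps"],
        "attitudes": ["Believe the script improves customer experience"],
        "intentions": ["Plan to apply the script on every call"],
    },
    "measurable_objectives": ["90% of trainees can recite the script steps"],
}

# Every survey question should trace back to one of these entries;
# if it doesn't, it is probably irrelevant and can be dropped.
def is_relevant(question_topic: str) -> bool:
    all_entries = (
        impact_map["desired_results"]
        + impact_map["behaviors"]
        + sum(impact_map["drivers"].values(), [])
        + impact_map["measurable_objectives"]
    )
    return any(question_topic.lower() in entry.lower() for entry in all_entries)
```

Before adding a question, check that it maps to at least one entry; questions that map to nothing are candidates for cutting.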

Tip 2. Give your survey a clear structure

The best way to write good training feedback survey questions is to give your survey a clear structure. Structure covers how you present the questions and the order you put them in, and it has a big impact on the value of the questions and whether they are answered or skipped over.

A good structure is to place your highest-level metrics first. This makes it far more likely that they’ll be answered, even if the survey isn’t fully completed.

For example, suppose you are writing questions for a survey about the content of a training course, perhaps because you are following the Kirkpatrick model of training evaluation. The goal of the survey would be to find out how trainees felt about the content:

  • Was it pitched at the right level?
  • Was it valuable?
  • Did it make an impact?

The questions you ask about these metrics should be placed first in the survey. This gives your survey a solid structure that will help ensure that the survey meets its goals.

Tip 3. Use conversational language

Write your questions with simple, conversational language that the average reader will understand with ease. Testing out your questions on friends or colleagues is especially helpful as it forces you to use plain, everyday language that is readily understood. 


 

Tip 4. Pose questions in as few words as possible

The problem with long questions is that people tend to skip over the important words, which increases the likelihood that your question will be misinterpreted or skipped completely. The solution? Strip out anything complicated and write each question in the simplest possible way.

Tip 5. Replace specialized words with simpler alternatives

As a rule of thumb, if a word has more than seven letters, there’s probably a simpler word that could replace it. Try to state your questions using the simplest possible words. This will greatly minimize the chances of respondents misinterpreting or misunderstanding the questions.

Words that may mean different things to different people include:

  • many
  • several
  • most
  • numerous

Words with clearer, less ambiguous meanings include:

  • almost none
  • almost all
  • the majority of

Tip 6. Avoid questions that rely on long-term memory

Most of the questions in your training feedback survey will be focused on events that happened relatively recently, i.e. during the training. However, asking respondents about events that happened in the distant past tends to produce low-quality data of limited value.

For example, you may get solid answers if you ask how many training courses people have attended over the past year. But asking how many times they have studied a particular topic may give you far less accurate answers.
 


 

Tip 7. Limit the number of ranking options to 6

Ranking questions work well for training feedback surveys as you can ask participants to rank items in order of importance or preference. Although it’s difficult to put an exact number on how many ranking options you should provide, in our experience offering more than six options isn’t advisable.

For example, if you want to get feedback on a customer service training course and want to know the average number of calls that the trainees can handle in an hour, you’ll want to include the 5-6 most likely responses and not an exhaustive list of 12-15 possibilities.

You can always provide an ‘Other’ option to collect data from respondents whose answers don’t fit the options you provide. For example, Kodo Survey offers a “Don’t know” alternative for single-choice and multiple-choice questions, so participants can select it instead of guessing the answer.

If you have more than six items, consider splitting the question into two parts. This will improve your survey’s response rate and reduce the chance of people abandoning the survey.

Tip 8. Always ask questions in complete sentences

Always phrase your training feedback survey questions in complete sentences to ensure that everyone interprets them in the same way.

For example, if you are writing a survey to gauge people’s reactions to a training session, you may want to know what they thought about the course content. Was it too easy? Too difficult? Too short? Or too long?

A bad question would be:
Number of modules completed: __________________

A better question would be:
How many training modules did you complete? ____________

Tip 9. Avoid double-barreled questions

Asking double-barreled questions – two questions in one – is an easy mistake to make.

For example, if you asked respondents “Were you satisfied with the course content and delivery?” this would be asking them two separate things instead of one.

It’s much better to break this question into two parts:

  • Were you satisfied with the course content?
  • Were you satisfied with the course delivery?

Asking two separate questions will give you more accurate and reliable answers.

Tip 10. Avoid vague qualifiers

When writing multiple choice questions, it can be tempting to use vague qualifiers. This makes the question quicker and easier to write, and it makes the results simpler to process.

For example:
How many training courses did you attend in the past year?

  • None
  • A few
  • Many

While this question is simple and quick to answer, the data it generates likely won’t be very useful. If respondents feel that the qualifiers are too vague, they’ll probably skip the question, or just fill in any old answer.

A better set of qualifiers would be:
How many training courses did you attend in the past year?

  • None
  • One or two
  • Three or four
  • More than four

When precise estimates are available, always use those. This approach will give you more useful data.

Tip 11. For multiple choice questions, cover all options without overlapping

When writing a multiple choice question that can only have one answer, make sure your options don’t overlap as this will skew your results.

For example, a training feedback survey question might ask the respondents:

How many years of experience do you have?

  • 1 to 5 years
  • 5 to 10 years
  • 10 to 15 years
  • 15 or more years

Given these choices, respondents with exactly 5, 10, or 15 years of experience would have two possible response options, which will affect the reliability of your results.
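Overlaps like these can be caught mechanically before the survey ships. Below is a minimal sketch (the `check_options` helper and the ranges are illustrative, assuming whole-year integer answers):

```python
# Check that single-answer numeric options neither overlap nor leave gaps.
# Each option is an inclusive (low, high) range; high=None means open-ended.
# Illustrative helper, assuming whole-year integer answers.
def check_options(ranges):
    issues = []
    for (lo1, hi1), (lo2, hi2) in zip(ranges, ranges[1:]):
        if hi1 is not None and lo2 <= hi1:
            issues.append(f"overlap: {lo2} to {hi1} falls in two options")
        if hi1 is not None and lo2 > hi1 + 1:
            issues.append(f"gap: {hi1 + 1} to {lo2 - 1} fits no option")
    return issues

# The flawed scale above: 5, 10, and 15 each match two options.
bad = [(1, 5), (5, 10), (10, 15), (15, None)]
print(check_options(bad))   # reports three overlaps

# A corrected, non-overlapping scale.
good = [(1, 5), (6, 10), (11, 15), (16, None)]
print(check_options(good))  # reports no issues
```

The same idea extends to age brackets or salary bands: adjacent ranges should meet at consecutive values, never share one.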

Tip 12. If you mention the response scale in your question, make sure to include both sides

Good training feedback survey questions must be neutral and free from bias. One error that often goes unnoticed is failing to include both sides of the response scale in your question.

An example of a biased survey question would be: ‘How satisfied were you with the training?’

A more neutral question would be ‘How satisfied or dissatisfied were you with the training?’

An even better approach is to remove any mention of the response scale from the question and let the answer options carry it:

Please rate the training you received.

  • Very dissatisfied
  • Dissatisfied
  • Neither satisfied nor dissatisfied
  • Satisfied
  • Very satisfied

Tip 13. Balance the number of positive and negative response options

Bias is one of the most difficult things to detect when writing training feedback survey questions. An often overlooked way bias creeps in is failing to balance the number of positive and negative options in multiple choice questions.

For example, if you want to gauge the reaction of trainees to an instructor, a bad question to ask would be:

How satisfied or dissatisfied were you with the course instructor?

  • Very satisfied
  • Mostly satisfied
  • Somewhat satisfied
  • Neither satisfied nor dissatisfied
  • Dissatisfied

A person looking at this would assume that ‘Somewhat satisfied’ is in the middle of the scale. However, the option ‘Neither satisfied nor dissatisfied’ should actually be the middle option, as in the following question:

How satisfied or dissatisfied were you with the course instructor?

  • Very satisfied
  • Mostly satisfied
  • Somewhat satisfied
  • Neither satisfied nor dissatisfied
  • Somewhat dissatisfied
  • Mostly dissatisfied
  • Very dissatisfied

Balancing out the number of positive and negative responses is a good way to remove bias from your survey.
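A balance check like this is easy to automate when your options use consistent wording. The sketch below assumes labels follow the ‘satisfied’/‘dissatisfied’ phrasing of the examples above (the helper is illustrative, not a standard API):

```python
# Count positive and negative options on a satisfaction scale.
# Assumes options use 'satisfied'/'dissatisfied' wording (illustrative only).
def is_balanced(options):
    neutral = "neither satisfied nor dissatisfied"
    labels = [o.lower() for o in options if o.lower() != neutral]
    negative = sum(1 for o in labels if "dissatisfied" in o)
    positive = sum(1 for o in labels if "dissatisfied" not in o and "satisfied" in o)
    return positive == negative

# The lopsided scale: three positive options but only one negative.
unbalanced = ["Very satisfied", "Mostly satisfied", "Somewhat satisfied",
              "Neither satisfied nor dissatisfied", "Dissatisfied"]

# A balanced scale: three positive and three negative options.
balanced = ["Very satisfied", "Mostly satisfied", "Somewhat satisfied",
            "Neither satisfied nor dissatisfied",
            "Somewhat dissatisfied", "Mostly dissatisfied", "Very dissatisfied"]

print(is_balanced(unbalanced))  # False
print(is_balanced(balanced))    # True
```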

Tip 15. Always distinguish between “Undecided” and “Neutral”

Another mistake that can skew your training feedback survey results is omitting “Undecided” as an option. Placing ‘Undecided’ at the end of the scale helps distinguish it from a neutral response such as ‘Neither satisfied nor dissatisfied’.

Tip 16. Don’t write answer categories that overlap

An easy mistake to make is writing answer categories that overlap. This can affect the reliability of your survey data, and respondents may skip the question entirely if they feel that no single category fits their situation.

A bad question would be:

How did you first learn about this training?

  • Website
  • Company
  • Email
  • Colleague
  • Advertisement

Imagine that a respondent found out about the training when a colleague emailed them the details. Which option would they choose?

What if they saw an advertisement on a website?

To catch answers that span more than one category, you can provide an ‘Other’ option for people to write in their specific answer.

Tip 17. Avoid asking respondents to answer yes to mean no

Our final tip is to avoid writing questions where a positive response such as ‘yes’ conveys a negative answer.

For example: Would you agree that this training was not a worthwhile endeavor?

  • Yes
  • No

This could cause confusion for respondents. It is better to state the question so that the response matches their intention.

For example:

Do you feel the training was worthwhile?

  • Yes
  • No

Tip 18. Continually test your questions to standardize them

One of the main goals of any training feedback survey question should be for any potential respondent to interpret the questions in the same way. To do that, you must try to put yourself in the shoes of your potential respondents. Check that the meaning of your questions is clear and unambiguous.

During and after the writing process, test your questions on friends or colleagues and iron out any problems that arise. Reword, rephrase or revise any unclear questions until you are reasonably confident that they are standardized.

Tip 19. Make the options look equally likely to be correct

With single-choice and multiple-choice questions, it’s important that the alternative answers look equally correct to the ‘untrained eye’.

First, make the alternatives roughly the same length as the real answer. Second, make their content look equally plausible.

A poor single-choice question would be:

What is the chemical composition of nail polish remover?

  • Chocolate
  • Ice cream
  • Candy
  • Acetone

A better question would contain several alternatives that all look plausible. It would also have a “Don’t know” option to dissuade respondents from guessing:

What is the main chemical in nail polish remover?

  • Formaldehyde
  • Phthalates
  • Acetone
  • Toluene
  • Benzophenones
  • Don’t know

Tip 20. Match the question to the taxonomy level of the learning objective

Using Bloom’s taxonomy as a guide, match each question to the taxonomy level of the learning objective it is supposed to measure. For instance, if you are writing survey questions for a coaching course and are targeting Level 1 (Remembering), you would want to write questions that test recall of facts.

A good example of a single-choice question aimed at Level 1 would be:

“What does the acronym GROW stand for?”

Respondents could select one from a range of answers to indicate their ability to remember this fact.

Ready to write your training feedback survey questions?

Following our tips should help you write effective training feedback survey questions that get you solid data. However, we still recommend testing your survey on a small sample before rolling it out.
Good luck!

To find out more, download our white paper, Determining and optimizing the impact of your training and development, or book a meeting with one of our experts.

